Attacking Doorway/Thin Content pages?
-
What's the best way to approach fixing thin "city + services" pages? Would you recommend doing one page at a time, or doing a little on a bunch of pages at a time?
For example, rewrite one page with 1,000 words of unique content, adding city-specific images/videos of services rendered and local testimonials over the course of a week? Then go to the next page the following week?
Or, one week add city-specific images/videos to all the pages you can? Then, the next week add something else to all the pages?
I'm trying to figure out the best way to scale this, and also which approach Google/search engines would prefer or look on more kindly.
Thanks,
Ruben
-
Thanks!
-
Good question. That is a very important thing to do right. I use this before my </head> tag:

<meta name="robots" content="noindex, follow">
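For anyone who wants to sanity-check this on their own pages: here's a rough stdlib-only sketch (my own, not anything from EGOL's post) that checks whether a page's HTML actually carries a robots meta directive with noindex in it:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content value of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.directives.append((d.get("content") or "").lower())

def is_noindexed(html):
    """True if any robots meta directive on the page contains 'noindex'."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)
```

Handy for crawling your own thin pages after deploying the tag and confirming none were missed.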
-
Quick followup question. Should I noindex/nofollow the pages or noindex/follow the pages?
-
Thanks, EGOL!
-
If a site has a Panda problem or the potential of a Panda problem, then time is very important. The best thing to do is to quickly noindex all of the thin pages to get them out of the index. That will allow the pages that remain in the index to be as strong as possible, avoid Panda, and recover.
After that, you want to get the noindexed pages back into action as quickly as possible so that you enjoy the traffic and income they produce. If yours is the kind of content that can be outsourced and still retain quality, then it might make sense to outsource that work and get those pages back to earning money right away; the quick return to action of those pages will fund the content development. If you are the only person who can do that writing and have limited time, then I would write the most valuable ones first, one at a time, properly, to avoid thin content on the site.
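And a tiny sketch of that "most valuable ones first" triage — the URLs and revenue figures below are made up purely for illustration; in practice you'd pull value estimates from your analytics:

```python
# Hypothetical page records: (url, estimated monthly revenue, flagged as thin?)
pages = [
    ("example.com/aurora-widgets", 15.0, True),
    ("example.com/denver-widgets", 420.0, True),
    ("example.com/boulder-widgets", 90.0, True),
    ("example.com/about", 0.0, False),
]

def rewrite_queue(pages):
    """Order the thin (noindexed) pages by value, highest first.
    Rewrite them one at a time and return each to the index as soon
    as its content is rebuilt."""
    thin = (p for p in pages if p[2])
    return sorted(thin, key=lambda p: p[1], reverse=True)
```

The queue simply front-loads the pages that fund the rest of the content work.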