What can I do to rank higher than low-quality low-content sites?
-
We lost our site in an actual meltdown at our hosting provider in January and decided to build a new site instead of bringing back a dated backup. So we've only been "active" at our URL since about May.
That said, I have not seen any irregular or unexpected penalties. Not showing up is natural if you have literally nothing to show.
We have had a site since then, though, and while it isn't going to win any awards, we've built it with best practices using sites like this one, trying to use natural, helpful, honest language to convey what we do and why we do it (we're web developers making WordPress sites for small businesses). We pay attention to titles, keyword frequency and variability, alt attributes, etc., always erring on the conservative side.
While we build sites for people across the country (and a few in places like the UK), we just moved into an actual office space in our hometown so it's never been more important to push our visibility locally.
We've just come back on the scene, in relative terms, so there's no expectation we'll crack the top five or ten; they all have teams of people and bags of capital and have been around many, many years, plus they link to the dozens upon dozens of sites they have done and promote their appearances in press releases and such.
Their content is not bad; most of it is good, genuine, and not spammy.
That said, we're in the late 40s to late 50s right now. Happy to show up at all, but after that first group of legitimate sites, there are
- automatically generated webpages (which I thought couldn't even be listed...one is an MP3 download site that mentions one of the top companies in the page title, and just has a random video on the page)
- local companies touting themselves as SEO "experts" that say things like "Here at Company X, we work hard to bring you the best Rochester, NY web design in the hopes that when you make your Rochester, NY web design decisions, you'll think of us first Rochester, NY web design." I changed the company name and the location, but that's an actual line from their site
- job listings from places like Craigslist and Indeed
- hair stylists
- dentists (?!)
Our code validates, we've incorporated Schema markup for our addresses, and our site is usually fast (650 ms to 1.3 s in Pingdom from Dallas). We don't do any redirecting; our metas, like everyone else's, don't count toward ranking, but they are thoughtfully written. We pay attention to using concise, accurate URLs without stop words, and very few resources are loaded on a given page.
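For reference, Schema markup for a business address is commonly added as JSON-LD; a minimal sketch (the business name, URL, and address here are placeholders, not the actual company's details) might look like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Web Design Co.",
  "url": "https://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Rochester",
    "addressRegion": "NY",
    "postalCode": "14604"
  }
}
</script>
```

A block like this goes in the page's head or body and can be checked with Google's structured data testing tools.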
That said, there's not a lot on the blog that's new, and all told we have, I think, 13 total pages, including a few posts.
Is it even possible to get close to the actual pack if we, for example, posted more regularly? I was just reading here that we shouldn't put our links in our clients' site footers (which we don't always do anyway), so I have them only as branded links, only on the homepages, and only on sites that had a zero spam score when crawled (everyone else gets a nofollow link in our portfolio).
I realize this is a super generic question but I wasn't quite sure how to search out this particular use case given that our aspirations are so basic...just trying to figure out if there's something obvious we're missing and shooting ourselves in the foot over.
A thousand pledges of gratitude!
(if this is too common and I just didn't see a duplicate, let me know and I will delete it or ask for it to be deleted....also, I don't want to appear spammy so I am not linking to my site unless it's absolutely necessary...not sure what protocol is...I'm pretty self-aware so I do believe everything I've said above is true).
-
Wow, never caught that before; just saw the similar thing that lists them but doesn't do anything about them down in the crawl section.
Thanks to you as well! Hopefully that'll raise some flags that this stuff needs to not be around anymore when they make another pass at it.
You guys are so helpful and encouraging!
-
If you have many URLs from the old site in the index that are all in the same directory (or a handful of directories), you can remove whole directories of URLs from the index via Google Search Console. We have found it to take effect very quickly.
-
Go into Search Console and select 'Remove URLs' under 'Google Index' in the left-hand menu.
-
Add the page or folder you want to remove, and click next. If you add the homepage, that's the same as all pages on the site. If you add a folder you'll get three options under the ‘Reason’ drop down.
One of those options is ‘Remove directory’. Select that.
-
-
1 server error, 1 soft 404, and 5,419 not found. It says it determined those were all pages based on the individual staging site's XML sitemap (which, of course, doesn't exist, which makes it all the more confusing).
How do you assign a code to a directory? Is that a similar process to changing folder permissions on FTP?
From what I've read, it doesn't sound like this will help SEO-wise, but it sure would give me peace of mind.
At the very least, I have this one thing to tackle that I didn't know about before, so I am going to call that a win and thank you for your time once more!
-
It's good that the backlinks you can find point to new, existing content. Also check the crawl errors in Search Console to see if there are potential 404s that are still being linked to.
With regard to the staging content, theoretically you could assign a 410 status code to that directory, thereby telling crawlers that the content does not exist and won't exist ever again. Search Engine Watch has a quick rehash on Google's approach to 404 and 410 codes.
For the remaining urls from the old site, since they don't have current equivalents, a 301 redirect to the home page would be a good move. At least then if someone has an old link to your site, they're getting to something that isn't a 404.
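As a sketch of how both of those might look in an Apache .htaccess file (the `/staging/` and `/old-blog/` paths are placeholders for whatever the actual old directories were; a different server, such as Nginx, would use different directives):

```apache
# Answer requests for the dead staging directory with 410 Gone,
# telling crawlers the content is permanently removed (mod_alias)
RedirectMatch gone ^/staging/.*$

# 301 any remaining old-site URLs that have no current equivalent
# to the home page
RedirectMatch permanent ^/old-blog/.*$ https://www.example.com/
```

`RedirectMatch` applies a status code to every URL matching the pattern, which is how a whole directory gets a 410 without the folder needing to exist on disk.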
-
The staging sites have not been live for some time. There was no preventative measure if someone had the link, but as I mentioned, I did check off not to be indexed and added them to the root domain's robots.txt. I only gave the link to the client directly, so it's me and them and perhaps a friend or family member who they asked to provide a second pair of eyes in evaluating a given design.
That's part of my concern. They don't exist, and haven't. The apps have been uninstalled in the cases where Installatron was used, and the manually added ones have since been deleted. The folders don't exist, so http://www.myexample.com/staging/ just gives the 404 page at the main site, myexample.com, whereas previously it would bring up the site located at /staging/.
The pages on my own new site were built from the ground up, so they can't link to the old site in any way; I couldn't tell you for sure what the old page addresses even were except to say I know they're different because they're based off of the unique page titles (in this case).
As far as the staging sites, all the links were internal, just pointing to other pages within that same staging site. I've been through the live sites and all the changing of addresses between domains was clean, including changing the character counts for serialized arrays in the database. And no links have ever been published that pointed to the old site at any given time.
So it makes me think that, when they did exist, Google ignored the checked box in the settings and indexed them thoroughly; their own internal links would suffice, then, to display so many results, but why after a year would they not be cleaned from results when they no longer exist?
Search Console only lists four addresses being linked to, and they're all from the new site, and resultant from my old footer links (except for one place in Taiwan selling hotel reservations...go figure).
Open Site Explorer shows all the expected footer links for the external links but, perplexingly, has a record of lots of internal links from the old site (even test pages, like checkout pages for shopping cart plug-ins).
Since there aren't any equivalents, is my next move to somehow make a rule that anything beyond the few pages I deliberately made and am aware of get 301'd to the home page? Would that be an effective method of scrubbing the results of ghosts?
Thanks again for your time and patience. If you can't or don't wish to get back to me, I understand; I'm just very persistent when something I'm not used to is exposed to me, and it rolls around in my head and saturates my brain.
-
Is the staging site still live and accessible to anyone who has the link? If so, I recommend either shutting it down or password-protecting it, possibly via htaccess.
Regarding old site to new site mapping, there are likely pages linking to old content that are now resulting in 404 errors. That's going to be where most of the "juice" is. Take a look at the results in Open Site Explorer, as well as Google Search Console's links to your site section, and see where your current backlinks are pointing. If there are equivalents on the new site, having a 301 redirect from an old URL to the new equivalent will help guide visitors to the right content. If there's no equivalent, redirecting to the home page works in a pinch.
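A minimal htaccess password-protection sketch, assuming Apache with mod_auth_basic and a placeholder path for the password file:

```apache
# Require a login for everything in this directory.
# Create the password file first with: htpasswd -c /home/example/.htpasswd someuser
AuthType Basic
AuthName "Staging"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Dropping a file like this into the staging directory makes the whole subtree return 401 to anyone without credentials, including crawlers.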
-
Thanks for taking the time to respond! It sincerely means a lot.
All of the addresses are new this time around, and none have a precedent from the old site (in other words, we have a page discussing our graphic design services, but those weren't available before, so there's no bridge to be made).
I did the site:domain search and was floored, though... We do staging versions of all our development sites right on our domain and are careful never to publish the addresses in electronic form, even in private messages. Also, in WordPress, while it says it's up to the search engine to decide whether to honor it, we check the box that asks not to index the site.
I even have all of the subdirectories wildcarded in my robots.txt file...
...and yet there are tons and tons of addresses up to a year and a half old.
I don't understand why, with each new pass the crawler takes and a new sitemap in place, those would remain indexed; those staging sites don't even exist anymore.
Are those diluting my "juice?" How can I remove them? I see a removal tool that Google offers, but it seems they want you to tread lightly and I'm already taking a lot of the preventative measures they recommend.
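For reference, wildcarded staging directories in robots.txt might look something like the following (the paths are placeholders). One caveat worth knowing: Disallow only blocks crawling, so it does not by itself remove URLs that are already in the index, and it can even prevent crawlers from ever seeing a noindex setting on those pages.

```text
User-agent: *
Disallow: /staging/
Disallow: /clients/*/staging/
```

The `*` wildcard in the middle of a path is supported by Google's crawler, though not by every bot.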
Thanks again for your help. It's sort of embarrassing to not know these things, but I'm never going to get better at the things I have knowledge gaps in if I don't ask questions.
-
Is the new site at the same domain as the previous one? If so, I'd get a hold of a sitemap from the older version of the site, plus the results of a "site:domain" search to make sure there are proper 301 redirects from legacy pages to new URL structures.