What can I do to rank higher than low-quality low-content sites?
-
We lost our site in an actual meltdown at our hosting provider in January, and decided to do a new site instead of bringing back a dated backup. So we've only been "active" at our URL since about May.
That said, I have not seen any irregular or unexpected penalties. Not showing up is natural if you have literally nothing to show.
We have had a site since then, though, and while it isn't going to win any awards, we've built it with best practices using sites like this, trying to use natural, helpful, actual language to convey what we do and why we do it (we're web developers for small business making WordPress sites). We pay attention to titles, keyword frequency and variability, alt text, etc., always erring on the conservative side.
While we build sites for people across the country (and a few in places like the UK), we just moved into an actual office space in our hometown so it's never been more important to push our visibility locally.
We've just come back on the scene, in relative terms, so there's no expectation we'll crack the top five or ten; they all have teams of people and bags of capital and have been around many, many years, plus they link to the dozens upon dozens of sites they have done and promote their appearances in press releases and such.
Their content is not bad; most of it is good, not spammy, and genuinely written.
That said, we're ranking in the late 40s to late 50s right now. Happy to show up at all, but after that first group of legitimate sites, there are:
- automatically generated webpages (which I thought couldn't even be listed...one is an MP3 download site that mentions one of the top companies in the page title, and just has a random video on the page)
- local companies touting themselves as SEO "experts" that say things like "Here at Company X, we work hard to bring you the best Rochester, NY web design in the hopes that when you make your Rochester, NY web design decisions, you'll think of us first Rochester, NY web design." I changed the company name and the location, but that's an actual line from their site
- job listings from places like Craigslist and Indeed
- hair stylists
- dentists (?!)
Our code validates, we've incorporated Schema markup for our addresses, and our site is usually fast (650ms to 1.3s in Pingdom from Dallas). We don't do any redirecting; our metas, like everyone else's, don't count for ranking but are thoughtfully produced; we pay attention to using concise and accurate URLs without stop words; etc. There are also very, very few resources loaded on a given page.
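To give a concrete (anonymized) sketch of what I mean by Schema for our addresses, the markup looks something like this JSON-LD, with every value swapped for a placeholder:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Web Design Co.",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St.",
    "addressLocality": "Anytown",
    "addressRegion": "NY",
    "postalCode": "00000",
    "addressCountry": "US"
  }
}
</script>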
That said, there's not a lot on the blog that's new, and all told we have, I think, 13 total pages, including a few posts.
Is it even possible to get close to the actual pack if we, for example, post more regularly? I was just reading here about how we shouldn't put our links in the site footers of our clients (which we don't always do anyway), so I have them only as branded links, only on the homepages, and only on sites that had a zero spam score when crawled (everyone else has a nofollow link in our portfolio).
I realize this is a super generic question but I wasn't quite sure how to search out this particular use case given that our aspirations are so basic...just trying to figure out if there's something obvious we're missing and shooting ourselves in the foot over.
A thousand pledges of gratitude!
(If this is too common and I just didn't see a duplicate, let me know and I will delete it or ask for it to be deleted. Also, I don't want to appear spammy, so I am not linking to my site unless it's absolutely necessary; not sure what the protocol is. I'm pretty self-aware, so I do believe everything I've said above is true.)
-
Wow, I never caught that before; I'd only seen the similar feature down in the crawl section that lists them but doesn't do anything about them.
Thanks to you as well! Hopefully that'll raise some flags that this stuff needs to not be around anymore when they make another pass at it.
You guys are so helpful and encouraging!
-
If you have many URLs from the old site in the index that are all in the same directory (or a handful of directories), you can remove whole directories of URLs from the index via Google Search Console. We have found it to work very quickly.
-
Go into Search Console and select ‘Remove URLs’ under ‘Google Index’ in the left-hand menu.
-
Add the page or folder you want to remove, and click Next. If you add the homepage, that's treated the same as removing all pages on the site. If you add a folder, you'll get three options under the ‘Reason’ drop-down.
One of those options is ‘Remove directory’. Select that.
-
1 server error, 1 soft 404, and 5,419 not found. It says it determined those were all pages based on the individual staging site's XML sitemap (which, of course, doesn't exist, which makes it all the more confusing).
How do you assign a status code to a directory? Is that a similar process to changing folder permissions over FTP?
From what I've read, it doesn't sound like this will help SEO-wise, but it sure would give me peace of mind.
At the very least, I have this one thing to tackle that I didn't know about before, so I am going to call that a win and thank you for your time once more!
-
That's good that the backlinks you can find are to new, existing content. Also check the crawl errors in Search Console to see if there are potential 404s that are getting linked.
With regard to the staging content, theoretically you could assign a 410 status code to that directory, thereby telling crawlers that the content does not exist and won't exist ever again. Search Engine Watch has a quick rehash on Google's approach to 404 and 410 codes.
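If the old staging content all lived under a single folder, that can be a couple of lines in the root .htaccess, for example (the folder name is just a placeholder):

# Return 410 Gone for anything under the old staging folder
RedirectMatch 410 ^/staging/

That's only a sketch; adjust the pattern to whatever the real directory was called.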
For the remaining URLs from the old site, since they don't have current equivalents, a 301 redirect to the home page would be a good move. At least then, if someone has an old link to your site, they're getting to something that isn't a 404.
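In .htaccess that can be as simple as one line per known legacy URL, for instance (paths made up):

# Point a few known old URLs at the home page
Redirect 301 /old-services.html /
Redirect 301 /old-contact.html /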
-
The staging sites have not been live for some time. There was no preventive measure if someone had the link, but as I mentioned, I did tick the box asking not to be indexed and added them to the root domain's robots.txt. I only gave the link to the client directly, so it's me, them, and perhaps a friend or family member they asked to provide a second pair of eyes in evaluating a given design.
That's part of my concern. They don't exist, and haven't for a while. The apps have been uninstalled in the cases where Installatron was used, and the manually added ones have since been deleted. The folders don't exist, so http://www.myexample.com/staging/ just gives a 404 at the main site, myexample.com, whereas previously it would bring up the site located at /staging/.
The pages on my own new site were built from the ground up, so they can't link to the old site in any way; I couldn't tell you for sure what the old page addresses even were, except to say I know they're different because they're based on the unique page titles (in this case).
As far as the staging sites go, all the links were internal, just pointing to other pages within that same staging site. I've been through the live sites, and all the changing of addresses between domains was clean, including changing the character counts for serialized arrays in the database. And no links pointing to the old site were ever published at any point.
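(By changing the character counts I mean updating the length prefixes in PHP-serialized values so they match the new URL, roughly like this, with made-up domains:

Before: s:26:"http://staging.example.com";
After:  s:22:"http://www.example.com";

If the prefix doesn't match the string length, WordPress can't unserialize the value.)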
So it makes me think that, when they did exist, Google ignored the checked box in the settings and indexed them thoroughly; their own internal links would suffice, then, to display so many results, but why after a year would they not be cleaned from results when they no longer exist?
Search Console only lists four addresses being linked to, and they're all from the new site, resulting from my old footer links (except for one place in Taiwan selling hotel reservations... go figure).
Open Site Explorer shows all the expected footer links for the external links but, perplexingly, has a record of lots of internal links from the old site (even test pages, like checkout pages for shopping cart plugins).
Since there aren't any equivalents, is my next move to somehow make a rule that anything beyond the few pages I deliberately made and am aware of gets 301'd to the home page? Would that be an effective method of scrubbing the results of ghosts?
Thanks again for your time and patience. If you can't or don't wish to get back to me, I understand; I'm just very persistent when something I'm not used to is exposed to me, and it rolls around in my head and saturates my brain.
-
Is the staging site still live and accessible to anyone who has the link? If so, I recommend either shutting it down or password-protecting it, possibly via .htaccess.
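The password protection can be a few lines in an .htaccess file inside the staging folder, along these lines (the .htpasswd path is just an example):

AuthType Basic
AuthName "Staging"
AuthUserFile /home/youraccount/.htpasswd
Require valid-user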
Regarding old site to new site mapping, there are likely pages linking to old content that are now resulting in 404 errors. That's going to be where most of the "juice" is. Take a look at the results in Open Site Explorer, as well as Google Search Console's 'Links to Your Site' section, and see where your current backlinks are pointing. If there are equivalents on the new site, a 301 redirect from an old URL to the new equivalent will help guide visitors to the right content. If there's no equivalent, redirecting to the home page works in a pinch.
-
Thanks for taking the time to respond! It sincerely means a lot.
All of the addresses are new this time around, and none have a precedent from the old site (in other words, we have a page discussing our graphic design services, but those weren't available before, so there's no bridge to be made).
I did the site:domain thing and was floored, though... We do staging versions of all our development sites right on our domain, and are careful never to publish the addresses in electronic form, even in private messages. Also, in using WordPress, while it says it's up to the search engine to decide whether to honor it, we check the box that asks search engines not to index the site.
I even have all of the subdirectories wildcarded in my robots.txt file...
...and yet there are tons and tons of addresses up to a year and a half old.
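The relevant robots.txt lines look roughly like this (folder names changed):

User-agent: *
Disallow: /staging/
Disallow: /*-staging/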
I don't understand why, with each new pass the crawler takes and with a new sitemap in place, those would remain indexed; those staging sites don't even exist anymore.
Are those diluting my "juice"? How can I remove them? I see a removal tool that Google offers, but it seems they want you to tread lightly, and I'm already taking a lot of the preventive measures they recommend.
Thanks again for your help. It's sort of embarrassing to not know these things, but I'm never going to get better at the things I have knowledge gaps in if I don't ask questions.
-
Is the new site at the same domain as the previous one? If so, I'd get hold of a sitemap from the older version of the site, plus the results of a "site:domain" search, to make sure there are proper 301 redirects from legacy pages to new URL structures.
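The redirects themselves can live in .htaccess as a simple mapping, for instance (old and new paths are just examples):

# Map each legacy URL to its closest new equivalent
Redirect 301 /about-us.html /about/
Redirect 301 /our-work.html /portfolio/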