How to re-rank an established website with new content
-
I can't help but feel this is a somewhat untapped topic with a distinct lack of information. There is a massive amount of information around on how to rank a new website, or techniques to increase SEO effectiveness, but ranking a whole new set of pages, or indeed 're-building' a site that may have suffered an algorithmic penalty, is a harder nut to crack in terms of information and resources.

To start, I'll describe my situation:
SuperTED is an entertainment directory SEO project.
It seems likely we suffered an algorithmic penalty at some point around Penguin 2.0 (May 22nd), as traffic dropped steadily from then on, though not too aggressively. Then, coinciding with the newest Panda refresh (number 27, according to Moz) in late September this year, we decided it was time to re-assess our tactics to keep in line with Google's guidelines. We've slowly built a natural link profile over the past two years, but it's likely thin content was also an issue. So from the beginning of September to the end of October we took these steps:
- Contacted webmasters (unfortunately there was some 'paid' link-building before I arrived) to remove links
- 'Disavowed' the rest of the unnatural links that we couldn't get removed manually.
- Worked on page speed as per Google's guidelines until we received high scores in the majority of speed-testing tools (e.g. WebPageTest).
- Redesigned the entire site with speed, simplicity and accessibility in mind.
- Used .htaccess rewrites to create 'fancy' URLs, removing file extensions and simplifying the link structure.
- Completely removed two or three pages that were quite clearly just trying to 'trick' Google. Think a large page of links that simply said 'Entertainers in London', 'Entertainers in Scotland', etc. We 404'ed them and asked for URL removal via WMT; we're also considering serving a 410 instead.
- Added new content and pages that seem to follow Google's guidelines as far as I can tell, e.g. a main category page with sub-category pages beneath it.
- Started to build new links to our now 'content-driven' pages naturally by asking our members to link to us via their personal profiles. We offered an internal reward system for this, so we've seen a fairly good turnout.
- Many other 'possible' ranking factors: adding Schema data, optimising for mobile devices as best we can, adding a blog and beginning to publish original content, utilising and expanding our social media reach, creating custom 404 pages, removing duplicate content, utilising Moz, and much more. It's been a fairly exhaustive process, but we were happy to do it to stay within Google's guidelines.
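To make the disavow step concrete for anyone following along: the file uploaded to Google's disavow tool is just plain text, one URL or domain per line. A minimal sketch (these entries are made up, not from our actual file):

```text
# Webmasters contacted 2013-09-15, no response received
domain:spammy-directory-example.com
http://another-example.net/paid-links/page1.html
```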
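For reference, the .htaccess changes (extensionless URLs, plus the 410 option for the removed link pages) can be sketched roughly as follows. The paths and rules here are illustrative assumptions, not our live config:

```apache
RewriteEngine On

# If the requested path plus '.php' is a real file, serve it,
# so a clean URL like /wedding-bands can stand in for /wedding-bands.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L]

# Tell crawlers a removed 'link list' page is gone for good
Redirect gone /entertainers-in-london.php
```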
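On the Schema data point, what we added is along these lines; the snippet below is a simplified, hypothetical JSON-LD example for a single profile page rather than our exact markup:

```json
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Wedding Band",
  "url": "http://www.superted.com/profiles.php/wedding-bands",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "London",
    "addressCountry": "GB"
  }
}
```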
Unfortunately, some of those link-wheel pages mentioned previously were the only pages driving organic traffic, so once we removed them, traffic dropped to less than 10% of what it was previously. Equally, with the changes (.htaccess) to the link structure and the creation of brand-new pages, we've lost many of the pages that previously held Page Authority.
We've 301'ed those pages that have been 'replaced' with much better content and a different URL structure - http://www.superted.com/profiles.php/bands-musicians/wedding-bands to simply http://www.superted.com/profiles.php/wedding-bands, for example.

Therefore, with the loss of the 'spammy' pages and the creation of brand-new 'content-driven' pages, we've probably lost up to 75% of the old website, including the pages that were driving any traffic at all (even with potential thin-content algorithmic penalties). Because of the loss of entire pages, the changed URLs, and the rest discussed above, the site probably looks very new, and very heavily updated in a short period of time.
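As a sketch, one such 301 in .htaccess would look like this (illustrative, not our exact rules):

```apache
# Permanently redirect the old nested category URL to the new flat one
Redirect 301 /profiles.php/bands-musicians/wedding-bands http://www.superted.com/profiles.php/wedding-bands
```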
What I need to work out is a campaign to drive traffic to the 'new' site.
We're naturally building links through our own customer base, so these should be seen as quality, natural links.
Perhaps the sudden occurrence of a large number of 404s and 'lost' pages is affecting us?
Perhaps we've yet to be properly re-indexed, but it has been almost a month since most of the changes were made, and we'd often be re-crawled 3 or 4 times a week before the changes.
Our events page is the only one left to update with the new design; could this be affecting us? It may look like two sites in one.
Perhaps we need to wait until the next Google 'link' update to feel the benefits of our link audit.
Perhaps simply getting rid of many of the 'spammy' links has done us no favours - I should point out we've never been issued a manual penalty. Was I perhaps too hasty in following the rules?

Would appreciate a professional opinion, or input from anyone who may have experience with a similar process.
It does seem fairly odd that following guidelines and general white-hat SEO advice could cripple a domain, especially an aged one (the domain has been established for 10+ years) with relatively good domain authority within the industry.
Many, many thanks in advance.
Ryan.
-
For pure backlink checks, the most comprehensive tools are ahrefs.com and https://majestic.com
-
Something does seem wrong, that's what I thought.
The 20,000 links were from our development site, which should never have been indexed. The dev site was taken down the same week, so we hope any penalty won't linger.
The 8th of September seems fishy also; we've certainly not done that ourselves. Is there any way to check those links? Any tool?
Thanks in advance.
-
I think you need an in-depth analysis.
There's something definitely very wrong. I can see only 13 keywords, with a backlink profile of more than 500 linking root domains. Your traffic seems to have been in constant decline for a while, but in May something happened which more or less killed it completely.
It also looks like you gained around 15-20 thousand links between Oct 9 and 15, which smells fishy.
And on the 8th of September you got 150 linking root domains in one day; wow, that smells even worse.
-
Hi Max,
Thanks for the response.
There was no manual penalty at any point, and there still aren't any showing in WMT.
We're probably only ranking for perhaps 5-10 keywords, and most of them have no competition or are branded. There are a few local long-tail keywords we get traffic from still, such as 'Children's Entertainer in Wembley' and others similar.
This is why I thought of coming to the experts at Moz: it seems fairly strange that BEFORE the algorithm penalty (if indeed there was one) we were ranking fairly well for our industry keywords (think 'Children's Entertainers', 'Dancers', 'Clowns', etc.) and were probably ranking for over 100 keywords easily.
Since disavowing, link auditing, and removing clearly spammy content, and THEN adding new rich content, we're still ranking for almost no keywords after about a month.
As far as I can guess, we've either not been indexed/ranked yet (which seems odd as we used to be indexed fairly regularly) or there's something else going on.
Thanks again for the response.
-
Is Google WMT showing any manual penalty? As far as I can see from a quick look, you seem to be indexed for a very, very limited number of keywords. How many keywords are driving traffic if you look at WMT?
Related Questions
-
Never-ending new links and our rank continues to plummet
Hi everyone, I've been having an issue with a severe drop in rankings (#2 to around #36). All of my technicals seem to be OK; however, my images are being hotlinked (which I have blocked in nginx) from spam-like pages that pull in and link to an image on my site, then link again with a " . " as the anchor. Even more strange is that these pages are titled and marked up with the same titles and target keywords as my site. For example, I just got a link yesterday from a site leadoptimiser - d o tt- me, which is IMO a junk site. The title of the page is the same as one of my pages, and the page pulls in images relevant to my page, but the image sources are repos EXCEPT for 2 images from my site which are hotlinked to my page's images, and then an additional <a>.</a> link is placed to my website. I have gotten over 1500 of these links in the past few months from all different domains, but the website (layout etc.) is always the same. I have been slowly disavowing some of them, but do not want to screw anything up in case these links are already being discounted by G as spam and not affecting my rank. The community seems to be really split on the necessity of disavowing links like these. Because of these links, according to Ahrefs, my backlink profile is 38% anchor text of "." . Everything else checks out in my own review as well as in the Moz tools and Ahrefs, with very high quality scores etc. Webmasters is fine, indexing is fine, PageSpeed Insights is in the 90s, SSL is A+. I've never had to deal with what seems to be an attack of this size. Thanks.
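For anyone curious, hotlink blocking in nginx of the kind described is usually a referer check along these lines (the domains and extensions here are placeholders):

```nginx
# Refuse image requests whose Referer is a third-party site
location ~* \.(jpg|jpeg|png|gif)$ {
    valid_referers none blocked example.com *.example.com;
    if ($invalid_referer) {
        return 403;
    }
}
```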
White Hat / Black Hat SEO | | plahpoy1 -
How to save website from Negative SEO?
Hi, I have read a couple of good blog posts on negative SEO and learned about a few solutions which may help me protect my website from it. Here, I want to share my experience and live data regarding negative SEO. Someone is creating bad inbound links to my website; I found out about it via Google Webmaster Tools. Honestly, I have implemented certain solutions, like the Google disavow tool, contacting certain websites, and more, but I can still see a negative impact on organic visits. Organic visits have been going down for the last two months, and I think these bad inbound links are the biggest reason. You can visit the following URLs to learn more. Can anyone share your experience of saving a website from negative SEO? How can I save any website from negative SEO (~bad inbound links)? https://docs.google.com/file/d/0BxyEDFdgDN-iR0xMd2FHeVlzYVU/edit https://drive.google.com/file/d/0BxyEDFdgDN-iMEtneXU1YmhWX2s/edit?usp=sharing https://drive.google.com/file/d/0BxyEDFdgDN-iSzNXdEJRdVJJVGM/edit?usp=sharing
White Hat / Black Hat SEO | | CommercePundit0 -
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small scale test on this in our Washington Dc and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - A LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc, but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: “Our [Topic Area] training is easy to find in the [City, State] area.” As a note, other content such as directions and course dates will always vary from city to city so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. 
So they aren’t technically individual pages, although they seem like that on the web. If we don’t standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as: “Our [Topic Area] training is easy to find in the [City, State] area,” followed by other content specific to the location, or “Find your [Topic Area] training course in [City, State] with ease,” followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn’t have to be done to maintain custom formats/text for individual pages. So, mozzers, my question to you all is: can we standardize, with slight variations specific to that location and topic area, without getting dinged for spam or duplicate content? Often times I ask myself, "if Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram1
White Hat / Black Hat SEO | | CSawatzky
-
Hiding content or links in responsive design
Hi, I found a lot of information about responsive design and SEO, mostly theories and no real experiments, and I'd like to find a clear answer if someone has tested this. Google says:
"Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device"
https://developers.google.com/webmasters/smartphone-sites/details For usability reasons you sometimes need to hide content or links completely (not accessible at all by the visitor) on your page at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none"). Is this counted as hidden content that could penalize your site, or not? What do you guys do when you create responsive design websites? Thanks! GaB0
White Hat / Black Hat SEO | | NurunMTL
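For reference, the hiding technique being asked about looks like this; the class name and breakpoint are hypothetical:

```css
/* On small screens, remove the element from the layout entirely */
@media (max-width: 480px) {
  .secondary-nav {
    display: none;
  }
}
```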
Penguin Maybe? Ranking low for main term: Trying to find cause and correct
Hello, For nlpca(dot)com one of our main keywords is the term "NLP". We are ranking 25th for that term. Possible causes: 1. Keyword stuffing on the home page, though we need to use the term over and over again to describe ourselves. Also, competitors like nlpco(dot)com and nlpu(dot)com mention "NLP" a lot too. 2. Backlink profile: see this spreadsheet. We have a lot of sites from other countries and many sitewides, but all natural and almost all branded. Our company names are NLP Institute of California, NLP California, and NLP and Coaching Institute. 3. nlpcacoach(dot)org is a sitewide footer link. So is iepdoc.nl. We're going to ask the first site to take our link down. 4. No "What is NLP" article. I think that might help. 5. Most of our 60 articles are posted on other sites. We author about 30 of them. I'm working on authorship via rel="author" and rel="me" links. There are usually 2 authors. 6. Most of the title tags used to be 4 keywords separated by pipes ("|"); I changed them all after the updates took the keyword "NLP" down. That's about all I can think of. What do we do or clean up?
White Hat / Black Hat SEO | | BobGW0 -
What do you think of our new category page?
Hey Mozzers! We have come up with a new layout design for a category page and would love to have your opinion on it, specifically from an SEO perspective. Here is our current page: http://www.builddirect.com/Laminate-Flooring.aspx Our new page (pending approval): http://www.builddirect.com/testing/laminate-flooring/index.html Just to brief you in on the key differences between the old and new layouts: the left text-link menu is removed in the new layout; the new layout looks funny with JS disabled, with a long vertical line-up of products (perhaps important keywords/content in the new layout appears way down?); and a lot of 'clunk' has been removed (bits of text, links, images, etc). Thanks for checking this out.
White Hat / Black Hat SEO | | Syed1
-
Is it negative to put a backlink into the footer's website of our clients ?
Hello there! Everything is in the subject of this post, but here is the context: we are a web agency and, among other things, we build websites for our clients (most of them shops). Until now, we have put a link in their footer, like "developed by MyWebShop", but we don't know whether this is bad or not. With only one website we can gain hundreds of backlinks at once, but is it good for SEO or not? Will Google penalize us, thinking it's a black-hat practice? Is it better to put our link in the "legal notices" or "disclaimer" part of the websites? What is the best practice for lasting SEO? I hope you understand my question. Thank you in advance!
White Hat / Black Hat SEO | | mywebshop0 -
Does your website get downgraded if you link to a lower quality site?
My site has a PR of 4. My friend's site has a PR of 2, but I think he is doing some black-hat SEO techniques. I wanted to know whether the search engines would ding me for linking to (i.e., validating) a lower-quality site.
White Hat / Black Hat SEO | | jamesjd70