My pages are absolutely plummeting. HELP!
-
Hi all,
Several of my pages have absolutely tanked in the past fortnight, and I've no idea why. One of them, according to Moz, has a Page Optimisation Score of 96, and it's dropped from 10th to 20th. Our DA is lower than our competitors', but still, that's a substantial drop. Sadly, this has been replicated across the site.
Any suggestions?
Cheers,
Rhys
-
Thanks, Stevie. Apparently, there's been a 'phantom' update which I suspect is the culprit...
-
Oh, and I just spotted that you have the same text in two H1s and in the title. It just might make a difference if you either change the second H1 to a variant if possible and make it an H2, or, probably better, just lose the H1 tag on the logo completely. You never know.
I'd also be inclined to make the title longer; you've plenty of space for a few related keywords or a call to action.
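If you'd rather script that check across a few pages than eyeball them, here's a rough sketch in Python (requests + BeautifulSoup). The URL is just a placeholder, not your actual page:

```python
# Rough sketch only - the URL is a placeholder, not the actual page in question.
import requests
from bs4 import BeautifulSoup

url = "http://www.example.com/some-course-page/"  # placeholder: the page you want to check
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

title = soup.title.get_text(strip=True) if soup.title else ""
h1_texts = [h1.get_text(strip=True) for h1 in soup.find_all("h1")]

print(f"Title: {title!r}")
print(f"Found {len(h1_texts)} H1 tag(s): {h1_texts}")

# The two things mentioned above: more than one H1, and an H1 that just repeats the title.
if len(h1_texts) > 1:
    print("More than one H1 - consider demoting the logo heading to an H2 or a plain element.")
if any(text and text.lower() == title.lower() for text in h1_texts):
    print("An H1 duplicates the title - consider varying one of them.")
```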
-
Hi Rhys,
I'll be honest with you, I usually deal with relatively small businesses, so am probably not the best person for this one. It's mobile friendly (responsive), so that's not the issue. It doesn't have SSL, though, so that could be a factor; I'm sure I remember Google saying they'd be giving a boost to secure sites early this year.
Yours is an authority site in itself, so I doubt a few links either way would make a difference either. There's also plenty of text, which I assume is unique. I appreciate that you may not have control over the whole site, so could it be that the general site navigation changed somewhere further up and inadvertently pushed your pages deeper into the site as a whole (i.e. more clicks from the homepage)?
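If you want to sanity-check that click depth yourself, a small bounded breadth-first crawl from the homepage would tell you how many clicks the page now sits from the top. A rough sketch, with placeholder URLs and capped at a few hundred pages so it stays polite:

```python
# Rough sketch with placeholder URLs - a bounded breadth-first crawl to measure
# how many clicks a page sits from the homepage.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

HOME = "http://www.example.com/"                 # placeholder homepage
TARGET = "http://www.example.com/some-page/"     # placeholder page that dropped
MAX_PAGES = 500                                  # keep the crawl small and polite


def normalise(url):
    # Strip query strings and fragments so the same page isn't counted twice.
    parts = urlparse(url)
    return f"{parts.scheme}://{parts.netloc}{parts.path}".rstrip("/")


seen = {normalise(HOME)}
queue = deque([(HOME, 0)])

while queue and len(seen) < MAX_PAGES:
    url, depth = queue.popleft()
    if normalise(url) == normalise(TARGET):
        print(f"{TARGET} is {depth} click(s) from the homepage")
        break
    try:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    for anchor in soup.find_all("a", href=True):
        link = normalise(urljoin(url, anchor["href"]))
        if urlparse(link).netloc == urlparse(HOME).netloc and link not in seen:
            seen.add(link)
            queue.append((link, depth + 1))
else:
    print("Target not reached within the crawl limit")
```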
-
Hi Stevie,
This is the course page in question, but this is only one of a number of pages that have dropped.
It's not great on mobile, but speed isn't the issue; it's more to do with how the page behaves on mobile.
http://www.swansea.ac.uk/undergraduate/courses/medicine/bscbiochemistry/
But this is the page which has an optimisation score of 96. Not really sure what else I can do.
Our DA did drop by 2 in the latest update a fortnight ago, but then again, so did many of our competitors'.
Cheers,
Rhys
-
Hi Rhys,
Without your URL it's impossible to take an educated guess, I'm afraid. Google are constantly tweaking things, so if it's only Google and not any other SEs, it could well be an algorithm adjustment, assuming you've not changed anything yourself. You can check that in your Moz account if you're tracking any other SEs (I always keep a non-Google one in there myself for that exact reason).
If your site doesn't have SSL and the sites that rose above you generally do, perhaps it was that; or, if your site isn't mobile friendly, they could have put more emphasis on that? My blind guess is those are the two likely culprits at the moment. But it could be something as simple as you dropped a high authority link, if there weren't many to start with, or one of soooooo many other factors.
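For what it's worth, a quick way to compare those two factors across your page and the ones that moved above it is a few lines of Python; the URLs below are placeholders, not real competitors:

```python
# Rough sketch - the URLs below are placeholders, not real competitors.
import requests
from bs4 import BeautifulSoup

pages = [
    "http://www.example.com/your-page/",      # placeholder: the page that dropped
    "https://competitor-one.example/page/",   # placeholder: a page that now outranks it
    "https://competitor-two.example/page/",   # placeholder: another one
]

for url in pages:
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    on_https = resp.url.startswith("https://")  # did we end up on a secure URL?
    soup = BeautifulSoup(resp.text, "html.parser")
    has_viewport = soup.find("meta", attrs={"name": "viewport"}) is not None
    print(f"{url}\n  serves over HTTPS: {on_https}\n  has a viewport meta tag: {has_viewport}")
```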
-
Hi Stevie,
Thanks for the link; I don't think it's that, because we don't have any pop-ups or ads. Individual pages have just plummeted; one dropped 81% in a week. Any other ideas? There's not much left for me to optimise, so I don't see how I can regain my first-page positions.
Cheers,
Rhys
-
Complete stab in the dark not knowing what your URL is, but there was an update around that time; see if it applies to you: https://moz.com/google-algorithm-change
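If you want to line your own rank tracking up against that change log, a rough sketch along these lines would flag whether your worst drop falls near an update. The CSV layout (date and rank columns) and the update date are assumptions, not a real Moz export:

```python
# Purely illustrative: check whether a ranking drop lines up with a known update date.
# Assumes a hypothetical CSV with "date" (YYYY-MM-DD) and "rank" (integer) columns.
import csv
from datetime import date, datetime, timedelta

UPDATE_DATE = date(2017, 2, 1)   # placeholder - use the date from the change log
WINDOW = timedelta(days=7)       # how close the drop has to be to the update to count

rows = []
with open("rank_history.csv", newline="") as f:   # hypothetical rank-tracking export
    for row in csv.DictReader(f):
        rows.append((datetime.strptime(row["date"], "%Y-%m-%d").date(), int(row["rank"])))

rows.sort()
# Find the single worst day-to-day drop (rank increasing numerically = losing positions).
drop_date, positions_lost = max(
    ((later[0], later[1] - earlier[1]) for earlier, later in zip(rows, rows[1:])),
    key=lambda pair: pair[1],
)
print(f"Biggest drop: {positions_lost} position(s) on {drop_date}")
if abs(drop_date - UPDATE_DATE) <= WINDOW:
    print("That falls within a week of the update, so the algorithm change is a plausible cause.")
else:
    print("The drop doesn't line up with that update date.")
```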
Related Questions
-
Help choosing ideal URL structure
Hi All, We are considering changing the link structure for the website of a large restaurant group, which represents about 100 restaurants in the USA. While I have some opinions, I'd very much welcome the opinions of some other seasoned SEOs as well. There are two options on the table for the link structure, which you can see below. The question is about restaurants with multiple locations, and how we structure those URLs. The main difference is whether we include "/location/" in the URL, or if that is overkill. I suppose it could have some value if someone is searching a term like "Bub City location", with "location" right in the search. But otherwise, it just adds to the length of the URL, and I'm not sure if it'll bring any extra value... In this example, "bub-city" is the restaurant name, and "mb-financial-park" is one of the locations.
Option A:
http://leye.local/restaurant/bub-city
http://leye.local/restaurant/bub-city/location/mb-financial-park/
Option B:
http://leye.local/restaurant/bub-city
http://leye.local/restaurant/bub-city/mb-financial-park/
Thoughts?
Local Website Optimization | SMQ
-
Which URL and rel=canonical structure to use for location based product inventory pages?
I am working on an automotive retailer site that displays local car inventory in nearby dealerships based on location. Within the site, a zip code is required to search, and the car inventory is displayed in a typical product list that can be filtered and sorted by the searcher to fit the searcher's needs. We would like to structure these location-based product inventory list pages to give the best chance at ranking, if not now, further down the road when we have built up more authority to compete with the big dogs in the SERP like AutoTrader.com, TrueCar.com, etc. These higher authority sites are able to rank their location based car inventory pages on the first page consistently across all makes and models. For example, searching the term "new nissan rogue" in the Los Angeles, CA area returns a few location based inventory pages on page 1. The sites in the industry that are able to rank their inventory pages will display a relatively clean looking URL with no redirect that still displays the local inventory, like this in the SERP:
https://www.autotrader.com/cars-for-sale/New+Cars/Nissan/Rogue
but almost always use a rel=canonical tag within the page pointing to a page with a location parameter attached to the end of the URL, like this one:
https://www.autotrader.com/cars-for-sale/New+Cars/Nissan/Rogue/Los+Angeles+CA-90001
I'm having a hard time figuring out why sites like this example have their URLs and pages structured this way. What would be the best practice for structuring the URL and rel=canonical tags to be able to rank for and display location based inventory pages for cars near the searcher?
Local Website Optimization | tdastru
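(As a purely illustrative aside: the pattern described above is easy to confirm by fetching a listing page and reading out the rel=canonical target it declares. The URL in this sketch is a placeholder, not a real endpoint.)

```python
# Illustrative sketch only - placeholder URL, not a real inventory endpoint.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/cars-for-sale/some-inventory-page"  # placeholder listing page
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

canonical_href = None
for link in soup.find_all("link"):
    # rel is a multi-valued attribute, so check the list rather than a raw string
    if "canonical" in [value.lower() for value in link.get("rel", [])]:
        canonical_href = link.get("href")
        break

print("Requested URL:", url)
print("Canonical URL:", canonical_href or "none declared")
```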
-
Landing page, or redirect? Looking for feedback.
If we have a section of our site that we have branded separately from the rest of the site, does it make sense to provide a landing page on our current, high authority site that has content and links off to the separate site, or would just a domain.com/keyword redirect to the page be a better route? Does it matter? I have an idea, but I'd like to get feedback on this. We are a newspaper, http://billingsgazette.com, and we have an auto branded site called http://montanawheelsforyou.com. The URL and branding are fubar. We're wondering if we could increase the ranking if we swapped http://billingsgazette.com/autos from a redirect to http://montanawheelsforyou.com into a landing page with content and a link to http://montanawheelsforyou.com.
Local Website Optimization | rachaelpracht
-
All metrics appear to be better than our local competitors' yet our ranking doesn't reflect it. Help?
Hi, I work for a marquee company and have recently been really trying to optimise our SEO through good content, link building, social media (especially Google+) and so on. Yet a rival (www.camelotmarquees.com) who performs worse than us on the majority of the Moz parameters still ranks better than us in both organic search and Google Places. The clear and obvious factor they beat us on is internal links, currently over 15,000, which seems ridiculous for the size of their site compared to our roughly 120. Would this have that much of an effect on the rankings, and how on earth have they got so many? Also, are there any tips or advice to help us leapfrog them? We feel we're producing regular, useful content and have optimised our site the best we can.
website: www.oakleafmarquees.co.uk
keywords: marquee hire dorset, marquee dorset, dorset marquee hire, wedding marquee hire
Local Website Optimization | crazymoose78
-
Multi Location business - Should I 301 redirect duplicate location pages or alternatively No Follow tag them?
Hello All, I have an eCommerce site and we operate out of multiple locations. We currently have individual location pages for these locations against each of our many categories. On the flip side, however, this creates a lot of duplicate content. All of our location pages, whether unique or duplicated, have a unique title tag, H1, H2 tag and NAP, and they all bring in the city name. The content on the duplicated pages also brings in the city name. We have been going through our categories and writing unique content for our most popular locations to help rank in local search. Currently I've been setting up 301 redirects for the locations in the categories with the duplicated content, pointing back to the category page. I am wondering whether the increase in the number of 301s will do more harm than having many duplicate location pages? I am sure my site is affected by the Panda algorithm penalty (on the duplicated content issues), as a couple of years ago this didn't matter and we ranked top 3 for pretty much every location, but now we are ranking between 8th and 20th depending on the keyword. An alternative I thought of, instead of 301ing those location pages with duplicate content, is to put No Follow tags on them instead? What do you think? It's not economically viable to write unique content for every location in every category; that would not only take years but would cost us far too much money. Our site is currently approx 10,000 pages. Any thoughts on this greatly appreciated? Thanks, Pete
Local Website Optimization | PeteC12
-
Had SEO Firm tell me to Start Over - pros and cons help please
Hi, So I have quotes of 1250 to 2500 a month to run my website, SEO wise. What I am told is they will do all Facebook postings, 4 blog posts each month, some citations, and site optimization. Those amounts do seem like a lot. Yet I was told to start all over. Basically I was told that because of some bad backlinks, of which only a few remain, you can never recover from an algorithm penalty, and that with a disavow it's like telling Google "penalize me please". So the plan was this: $3,000 for a new site and new domain, and then it has no penalties and I will be ranking in no time. The problem is I am branded. My domain and business name is Bernese Of The Rockies. People know us and we are very respected. So if we create a new site like example.com, I do not want to mislead people. Or, if there is a penalty, for say a landing page or site where I am sending people to my main site for more info, type of thing. Just looking for your input on whether this is a common issue, where if you have a non-manual (algorithmic) penalty you must restart? Thank you so much for your thoughts and suggestions.
Local Website Optimization | Berner
-
Launching Hundreds of Local Pages At Once or Tiered? If Tiered, In What Intervals Would You Recommend?
Greetings Mozzers, This is a long question, so please bear with me 🙂 We are an IT and management training company that offers over 180 courses on a wide array of topics. We have multiple methods by which our students can attend these courses, either in person or remotely via a technology called AnyWare. We've also opened AnyWare centers, in which you can physically go to a particular location near you and log into a LIVE course that might be hosted in, say, New York, even if you're in, say, LA. You get all the in-class benefits and interaction with all the students and the instructor as if you're in the classroom. Recently, we've opened 43 AnyWare centers, giving way to excellent localisation search opportunities for our website (e.g. think "sharepoint training in new york", or whatever city we are located in). Each location has a physical address, phone number, and an employee working there, so we pass the standards for existence on Google Places (which I've set up). So, why all this background? Well, we'd like to start getting as much visibility as possible for queries that follow the format of "course topic area that we offer" followed by "city we offer it in." We offer 22 course topic areas and, as I mentioned, 43 locations across the US. Our IS team has created custom pages for each city and course topic area using a UI. I won't get into detailed specifics, but doing some simple math (22 topic areas multiplied by 43 locations) we get over 800 new pages that need to eventually be crawled and added to our site. As a test, we launched the pages 3 months ago for DC and New York and have experienced great increases in visibility. For example, here are the two pages for SharePoint training in DC and NY (total of 44 local pages live right now):
http://www2.learningtree.com/htfu/usdc01/washington/sharepoint-training
http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training
So, now that we've seen the desired results, my next question is, how do we launch the rest of the hundreds of pages in a "white hat" manner? I'm a big fan of white hat techniques and not pissing off Google. Given the scale of the project, we also did our best to make the content as unique as possible. Yes, there are many similarities, but courses do differ, as do addresses from location to location. After watching Matt Cutts's video here: http://searchengineland.com/google-adding-too-many-pages-too-quickly-may-flag-a-site-to-be-reviewed-manually-156058 about adding too many pages at once, I'd prefer to proceed cautiously, even if the example he uses in the video has to do with tens of thousands to hundreds of thousands of pages. We truly aim to deliver the right content to those searching in their area, so there's nothing black hat about it 🙂 But I still don't want to be reviewed manually lol. So, in what intervals should we launch the remaining pages so as not to raise any red flags? For example, should we launch 2 cities a week? 4 cities a month? I'm assuming the slower the better of course, but I have some antsy managers I'm accountable to, and even with this type of warning and research, I need to proceed somehow in the right way. Thanks again and sorry for the detailed message!
Local Website Optimization | CSawatzky
-
Location pages for Landing pages
So i have a client for carpet cleaning in Seattle, but he doesn't just want to rank up for "Carpet Cleaning Seattle" he wants to rank up for sub locations such as Lynnwood Carpet Cleaning
Local Website Optimization | | tonyr7
Kirkland Carpet Cleaning
Kenmore Carpet Cleaning
Issaquah Carpet Cleaning
Everett Carpet Cleaning
Edmonds Carpet Cleaning
Bothell Carpet Cleaning
Bellevue Carpet Cleaning
Auburn Carpet Cleaning
Orting Carpet Cleaning
Monroe Carpet Cleaning
Milton Carpet Cleaning
Marysville Carpet Cleaning
Lacey Carpet Cleaning Right now the designer he hired to develop the website has created a separate web page for each of these location pages. the reason being he services all these areas and wants to rank up for all of these areas with basically the same keyword... SEO is fairly simple to me when it comes to straight forward small sized projects or targeting specific services in one set location. But with all these algorithmic changes I worry that this is not something Google may want to see.. What is my best bet with this project, and what SEO methods would you recommend for a site that has 40 total landing pages all with similar keywords just different locations?0