Target term hits a glass ceiling despite A grade
-
Greetings from 13°C Wetherby, UK.
I've hit a roadblock in my attempts to get a target term onto page one. Below is a URL pointing to a graph illustrating the situation. The target term is on the graph (I'm reluctant to stick it in here in case this page comes up in the results):
http://i216.photobucket.com/albums/cc53/zymurgy_bucket/glass-ceiling-office-to-let.jpg
This is what I've done to date for the page http://www.sandersonweatherall.co.uk/office-to-let-leeds/:
1. Ensured the markup follows SEO best practice (a rough sketch of the kind of head markup involved is at the end of this post)
2. Internally linked to the page via a scrolling footer
3. Shortened the URL
4. Requested that the social media efforts point links to the page
5. Requested additional content

But I wonder... is the reason for hitting a glass ceiling now down to a lack of content (i.e. just one page), or is there a deeper issue of an indexing roadblock?
Any insights welcome
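To illustrate what point 1 above covers, here is a minimal sketch of the kind of head markup involved. The title and meta description below are examples only, not the live page's markup; the canonical URL is the page mentioned above.

```html
<head>
  <!-- Illustrative title and meta description targeting the page's theme -->
  <title>Offices to Let in Leeds | Sanderson Weatherall</title>
  <meta name="description" content="Office space to let in Leeds city centre from Sanderson Weatherall.">
  <!-- Canonical pointing at the shortened URL mentioned in the post -->
  <link rel="canonical" href="http://www.sandersonweatherall.co.uk/office-to-let-leeds/">
</head>
```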
-
The points you've mentioned are all related to on-page optimization. What exactly are you doing on the promotion side?
I guess you're already promoting this page via:
- Guest Blogging
- Forum Participation
- Directory listings (high-quality directories, regardless of nofollow/dofollow and PR)
- Article Syndication
- Press releases, etc.
- Sharing content on FB, Twitter, G+, etc. (very important)
If not, then start immediately. Optimization helps with better indexing, but not directly with search engine rankings.
Apply variation in your anchor text while building links, so that the profile looks natural, e.g.:
- office to let deals in Leeds
- office to let Leeds
- office to let in Leeds, etc.
It also appears as if the bounce rate of this page would be relatively high, no?
-
Hitting an A grade for on-page optimisation, in most cases, doesn't mean your site will make it to page one. So the next step is to delve deeper into your off-page work.
A couple of suggestions:
- Maybe support the page further by linking through from your own blog (http://thesandersonweatherallblog.com/) with useful content. Have your social team spread the message to your target audience and you should earn a few backlinks from this.
- Pull link data from OSE or Majestic for the competitors ranking on page one for your target term, look at the types of links they have, and see if you can "piggyback" off their methods.
- Given the geographic nature of your target search term, you will always be up against Places listings within the SERP, so it may be worth optimising the Places listing for your office in the centre of Leeds around this search term.
Related Questions
-
Lower Level Pages Being Ranked for Key Terms
Good afternoon. We've been having problems with a site for a little while now. It had a penalty (partial link) a few years ago and never really recovered to its full potential, despite the fact that the penalty was eventually removed; we've since changed the domain completely, moved over to https and left behind / disavowed bad links. In the Moz ranking stats now, I'm seeing that some of our lower-level pages are ranking for core terms, and the erratic nature of the ranking graph seems to indicate that Google is confused and doesn't know which page to pull. For example, the top-level page would be "Hotel in Spain", but the page that is ranking for that term is one of the individual hotel information (lower-level) pages, let's say the Holiday Inn. The lower-level page has info on the individual property but also makes reference to it being a "Cheap Hotel in Spain". My suggestion to resolve the problem is to scale back the references to the top-level terms on the hotel pages and reintroduce breadcrumb links to help Google follow the structure of the site again. Does this sound reasonable, or would anyone be able to suggest anything else to try?
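For illustration only, this is roughly the kind of breadcrumb trail being suggested for the lower-level hotel pages; the URLs and labels are hypothetical, not taken from the site in question.

```html
<!-- Hypothetical breadcrumb on an individual hotel page -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/hotels-in-spain/">Hotels in Spain</a> &gt;
  <span>Holiday Inn</span>
</nav>
```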
Technical SEO | | Ham19790 -
Search Console rejecting XML sitemap files as HTML files, despite them being XML
Hi Moz folks, We have launched an international site that uses subdirectories for regions and have had trouble getting pages outside of USA and Canada indexed. Google Search Console accounts have finally been verified, so we can submit the correct regional sitemap to the relevant search console account. However, when submitting non-USA and CA sitemap files (e.g. AU, NZ, UK), we are receiving a submission error that states, "Your Sitemap appears to be an HTML page," despite them being .xml files, e.g. http://www.t2tea.com/en/au/sitemap1_en_AU.xml. Queries on this suggest it's a W3 Cache plugin problem, but we aren't using Wordpress; the site is running on Demandware. Can anyone guide us on why Google Search Console is rejecting these sitemap files? Page indexation is a real issue. Many thanks in advance!
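As a point of comparison (not from this thread), a minimal well-formed sitemap file looks like the sketch below; the loc entry is illustrative. One common cause of the "appears to be an HTML page" error is the server returning something other than raw XML at the sitemap URL, for example an HTML error, login or redirect page.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> block per page; this loc is an illustrative example -->
  <url>
    <loc>http://www.t2tea.com/en/au/</loc>
  </url>
</urlset>
```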
Technical SEO | | SearchDeploy0 -
Google Indexing Development Site Despite Robots.txt Block
Hi, a development site that has been set up has the following robots.txt file, in an attempt to block Google from indexing the site: User-agent: * Disallow: / However, this hasn't worked and the development site has since been indexed. Any clues as to why this is, or what I could do to resolve it? Thanks!
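The robots.txt described in the question, laid out as it would appear in the file; the comments are added here as a clarifying note and are not part of the original.

```text
# Development site robots.txt as described above.
# Note: this blocks crawling, but a URL that Google discovers via
# links elsewhere can still be indexed (typically as a URL-only entry),
# because the crawler never gets to see any noindex directive on the page.
User-agent: *
Disallow: /
```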
Technical SEO | | CarlWint0 -
Are backlinks the reason for my site's much lower SERP ranking, despite similar content?
Hi all, I'm trying to determine why my site (surfaceoptics.com) ranks so much lower than my competitors' sites. I do not believe the site / page content explains this differential in ranking, and I've done on-site / on-page SEO work without much, if any, improvement. In fact, I believe my site is very similar in quality to competitor sites that rank much higher for my target keyword of "hyperspectral imaging". This leads me to believe there is a technical problem with the site that I'm not seeing, or that the answer lies in our backlink profile. The problem is that I've compared our site with four of our competitors in Open Site Explorer and I'm not seeing a strong trend when it comes to backlinks either. Some competitors have more links and better backlink profiles, but then other sites have no external links to their pages and lower PA and DA, and still outrank us by 30+ positions. How should I go about determining whether the problem is backlinks or some technical issue with the site?
Technical SEO | | erin_soc0 -
Website not ranking for noncompetitive terms
Hi, We took over a website last July and, no matter what we do, we just can't get it ranking in Google, even for noncompetitive terms. Here is the website in question: http://www.alignandsmile.co.uk Ideally the client would like to rank for Canary Wharf, but that location is competitive; the site doesn't even rank for 'Dentist New Providence Wharf E14', despite it being included in the title tag on the home page and in the content throughout the website. Directories with Align and Smile's business information do rank, however. I opened a case with Google through Webmaster Tools and they 'reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google.' So I'm a bit stuck. The site ranks top for the keyphrase in Bing and Yahoo... we are really struggling with Google! Any help would be much appreciated. Many thanks Marcus
Technical SEO | | dentaldesign0 -
Duplicate pages in Google index despite canonical tag and URL Parameter in GWMT
Good morning Moz... This is a weird one. It seems to be a "bug" with Google, honest... We migrated our site www.three-clearance.co.uk to a Drupal platform over the new year. The old site used URL-based tracking for heat map purposes, so for instance www.three-clearance.co.uk/apple-phones.html could be reached via www.three-clearance.co.uk/apple-phones.html?ref=menu or www.three-clearance.co.uk/apple-phones.html?ref=sidebar and so on. GWMT was told of the ref parameter and the canonical meta tag was used to indicate our preference. As expected, we encountered no duplicate content issues and everything was good.
This is the chain of events:
- Site migrated to the new platform following best practice, as far as I can attest to. The only known issue was that the verification for both Google Analytics (meta tag) and GWMT (HTML file) didn't transfer as expected, so between relaunch on the 22nd Dec and the fix on 2nd Jan we have no GA data, and presumably there was a period where GWMT became unverified.
- URL structure and URIs were maintained 100% (which may be a problem, now).
- Yesterday I discovered 200-ish 'duplicate meta titles' and 'duplicate meta descriptions' in GWMT. Uh oh, thought I. Expand the report out and the duplicates are in fact ?ref= versions of the same root URL. Double uh oh, thought I.
- Run, not walk, to Google and do some Fu: http://is.gd/yJ3U24 (9 versions of the same page, in the index, the only variation being the ?ref= URI).
- Checked Bing and it has indexed each root URL once, as it should.
Situation now:
- The site no longer uses the ?ref= parameter, although of course there still exist some external backlinks that use it. This was intentional and happened when we migrated.
- I 'reset' the URL parameter in GWMT yesterday, given that there's no "delete" option. The "URLs monitored" count went from 900 to 0, but today is at over 1,000 (another wtf moment).
- I also resubmitted the XML sitemap and fetched 5 'hub' pages as Google, including the homepage and HTML site-map page.
The ?ref= URLs in the index have the disadvantage of actually working, given that we transferred the URL structure and of course the webserver just ignores the nonsense arguments and serves the page. So I assume Google assumes the pages still exist, and won't drop them from the index but will instead apply a dupe content penalty. Or maybe call us a spam farm. Who knows.
Options that occurred to me (other than maybe making our canonical tags bold or locating a Google bug submission form 😄) include:
A) robots.txt-ing .?ref=. - but to me this says "you can't see these pages", not "these pages don't exist", so isn't correct
B) Hand-removing the URLs from the index through a page removal request per indexed URL
C) Applying a 301 to each indexed URL (hello Bing dirty sitemap penalty)
D) Posting on SEOMoz because I genuinely can't understand this.
Even if the gap in verification caused GWMT to forget that we had set ?ref= as a URL parameter, the parameter was no longer in use, because the verification only went missing when we relaunched the site without this tracking. Google is seemingly 100% ignoring our canonical tags as well as the GWMT URL setting - I have no idea why and can't think of the best way to correct the situation. Do you? 🙂
Edited to add: As of this morning the "edit/reset" buttons have disappeared from the GWMT URL Parameters page, along with the option to add a new one.
There are no messages explaining why, and of course the Google help page doesn't mention disappearing buttons (it doesn't even explain what 'reset' does, or why there's no 'remove' option).
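For readers following along, a sketch of the canonical tag as described in the question, sitting in the head of each ?ref= variant and pointing at the clean URL; this is illustrative, based on the example URLs given above.

```html
<!-- Served on every tracked variant, e.g.
     /apple-phones.html?ref=menu and /apple-phones.html?ref=sidebar -->
<link rel="canonical" href="http://www.three-clearance.co.uk/apple-phones.html">
```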
Technical SEO | | Tinhat0 -
Possible penguin hit but then back, now what's next?
Hi, I did a little check on my site by answering the quiz at mytrafficdropped.com, and there was a question about the dates on which organic traffic dropped. I checked my analytics for a top-sending keyword, and here is what I found (see attached image): traffic dropped completely from April 20 onwards, recovered somewhat in June, but dropped again in October and is still down. Any thoughts, guys? 1Jk47.png
Technical SEO | | wickedsunny10 -
Global SEO Targeting
Hi, I have a website currently on the domain example.co.uk (.com is not available). I'm looking to enter other markets such as Brazil and Russia; obviously the content will need to change to suit the desired market / language. I'm looking for some information on best practice for entering foreign markets. I was thinking of creating individual sites for each location, e.g. example.br and example.ru. This way I could localise each site in terms of business directories, content, language etc. Or I could keep my example.co.uk with various languages on it. Experience and suggestions are welcome - thanks.
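Not mentioned in the question, but worth sketching: whichever structure is chosen (separate country domains or one site with language sections), hreflang annotations are the usual way to tell search engines which version targets which market. The domains below are the hypothetical ones from the question.

```html
<!-- Placed in the head of each language/market version (illustrative domains) -->
<link rel="alternate" hreflang="en-gb" href="http://example.co.uk/">
<link rel="alternate" hreflang="pt-br" href="http://example.br/">
<link rel="alternate" hreflang="ru-ru" href="http://example.ru/">
```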
Technical SEO | | Socialdude0