Grr . . . Just can't seem to get there
-
mrswitch.com.au is one site we are consistently struggling with... It has a PageRank of 3, which beats most of the competitors, but when it comes to Google AU searches such as 'Sydney Electrician' and 'Electrician Sydney', we just can't seem to get there and the rankings keep dropping.
We build backlinks and update the pages on a regular basis.
Any ideas? Could it be the custom CMS?
-
Thanks for the help! - Perfect advice!
-
No probs Steve, glad I could help
-
Awesome, yeah that does help. Thanks
-
It's an HTML microformat.
You can use it to, for want of a better word, 'tag' your address; basically it's a way of telling bots that the string of characters that follows is an address. There is some conjecture as to its usefulness, but I believe it is best practice to use it, especially on local-search-focused projects.
More info here: http://microformats.org/wiki/hcard
And a nice tool here: http://microformats.org/code/hcard/creator
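For example, a minimal hCard wrapped around an address might look something like this (the class names come from the hCard spec; the business name, address and phone number below are just placeholders):
<div class="vcard">
  <span class="fn org">Mr Switch Electrical</span>
  <div class="adr">
    <span class="street-address">123 Example Street</span>,
    <span class="locality">Sydney</span>,
    <span class="region">NSW</span>
    <span class="postal-code">2000</span>,
    <span class="country-name">Australia</span>
  </div>
  <span class="tel">(02) 9000 0000</span>
</div>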
I hope that helps
-
What's the score with hCards? I don't know much about them.
-
Also, when you say you backlink regularly, from where? Try to get backlinks from local sites to local pages, e.g. for 'Sydney Electrician', get backlinks from local Sydney directories, blogs and sites about Sydney, etc.
I haven't checked any of these for whether they're paid, nofollow, etc... but just as examples:
http://www.sydney-city-directory.com.au/
http://www.sydneycity.net/directory.htm
http://www.sydneybusinessdirectory.net/
http://www.expat-blog.com/en/directory/oceania/australia/sydney/
http://sydney-city.blogspot.com/
http://blogs.usyd.edu.au/sydneylife/
And obviously mix that with relevant links to do with electricians... preferably from sites that are both electrician- and Sydney-focused, if possible. Easier said than done, I expect.
The same for other trades and other towns.
-
I fully agree with what Ryan suggested above, if local is your target. Also consider using the hCard microformat on your address; that can't hurt either.
-
Are you trying to get into the local listings? If that's the case, just get the address on the page and start submitting through Google Local. If it's a chain, submit the bulk listings and/or service areas. Pages like this: http://www.mrswitch.com.au/location are going to hurt you, as the engines will see them as a keyword-stuffing attempt to manipulate results around every Sydney suburb + 'Electrician'. I'd recommend getting rid of the location page in its current format and replacing it with a few of your actual locations. Use your keywords sparingly, and use the tools Google provides, especially mapping and reviews.