How do I figure out what's wrong with my site?
-
I'm fairly new to SEO and can't pinpoint what's wrong with my site... I feel so lost. I am working on revamping www.RiverValleyGroup.com and can't figure out why it's not ranking for keywords like 'Louisville homes', 'Homes for sale in Louisville KY', etc. Any suggestions?
I write new blog posts every day, so I feel there's no shortage of fresh content. I'm signed up with Moz Analytics and Google Analytics.
-
I checked Moz's Open Site Explorer and Ahrefs, which are both good sources of backlink data.
Structured data is a nice thing to have, but not having it wouldn't necessarily hurt your rankings. It can help Google make sense of your content more easily, and it can also help you stand out a bit more in the SERPs if Google chooses to show a rich snippet for your result.
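For reference, here's a rough sketch of what basic JSON-LD markup could look like for a local real estate business, using schema.org's RealEstateAgent type. Every value below is a placeholder you'd swap for your actual business details:

```html
<!-- Rough sketch only: schema.org RealEstateAgent markup in JSON-LD.
     All values are placeholders; replace them with your real business details. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "RealEstateAgent",
  "name": "River Valley Group",
  "url": "http://www.rivervalleygroup.com/",
  "telephone": "+1-502-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Louisville",
    "addressRegion": "KY",
    "postalCode": "40202"
  },
  "areaServed": "Louisville, KY"
}
</script>
```

You can run markup like this through Google's structured data testing tool to confirm it's being read before worrying about what Webmaster Tools reports.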
Getting rid of bad backlinks is usually a manual task of reaching out to and contacting webmasters, though there are some tools that can make the process a little less time-consuming. To be clear, I didn't review your links for quality; I was just noting that there were a lot of links from a small number of domains. The top one looked like the personal site of the company's owner or broker.
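If outreach gets you nowhere, Google also accepts a disavow file uploaded through Webmaster Tools. Treat it as a last resort, and again, I haven't judged whether any of your links actually warrant it. The format is just a plain text file with one URL or domain per line; the domains below are made-up examples:

```text
# Example disavow file format (made-up domains, for illustration only).
# Lines starting with # are comments.

# Ignore all links from an entire domain:
domain:spammy-directory.example.com

# Ignore links from a single page:
http://link-farm.example.com/widget-links.html
```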
-
Marty-
Thank you for the quick response....super helpful!
I didn't know I had thousands of backlinks... how do you check something like that? Also, if they are bad backlinks, how do I go about getting rid of them?
I was just looking at my Webmaster Tools account with Google and it was saying it couldn't detect any structured data on my site...could this be hurting me SEO-wise? Thanks!
-
Two people gave you a huge amount of information here...
http://moz.com/community/q/seo-frustration-is-my-website-too-busy
You never replied.
-
Greetings James! Welcome to the fun-filled and often onerous world of SEO.
From a quick look at your site, you seem to have a lot of content pages, etc., but very little in the way of trust flowing to your site. For example, I can see you have thousands of backlinks, but they're spread across only forty or so unique domains.
Since real estate is largely local, you'd do well to try to rank in the local pack results for real-estate-related searches. However, I noticed a Google+ page that looks like yours but doesn't list the same address as your website. You want those to match up, and then start building credibility for your Google+ page through positive reviews and the like from users.
Real estate is a super-competitive niche, so your best bet (at least until you've built more trust) is to target more of the long tail of search. Those are just a few tips to get you started, but any time you're competing in a big city or region, it's not going to be a quick and easy task. Keep at it, though; you'll get there!