My site has disappeared from the SERPs. Could someone take a look at it for me and see if they can find a reason why?
-
My site has disappeared from the SERPs. Could someone take a look at it for me and see if they can find a reason why? It used to rank around 4th for the search "austin wedding venues", and it still ranks number three for this search on Bing. I haven't done any SEO work on it in a while, so I don't think I did anything to make Google mad, but now it doesn't rank anywhere in the top 160 results. Here's the link:
http://austinweddingvenues.org
Thanks in advance Mozzers!
Ron
-
Actually, what you need to build is more content.
-
One more question: given that this site has very few links, would it hurt or help to add about 10 of the highest-PageRank directory links from the SEOmoz directory list before getting some natural links first?
Thanks again everyone!
-
Yes, I think it was the EMD update. I'll build more content. Thanks for the input!
-
I just checked and don't see any messages saying I have an issue. I think this was the EMD update.
-
I just read up on the EMD algorithm change, and I think that was it. I'm going to have to build more links to get the rankings back, I think. Thanks, everyone, for the input!
-
No, I always keep my customized search set to off. I checked my ranking by typing "austin wedding venues" into Google in Internet Explorer. This isn't just a simple drop from competitors building links and knocking me down a few places; my site has disappeared from the SERPs. Any clue as to why this happened? I'm stumped...
-
One more suggestion: you are not using .htaccess to force either www or non-www. From the way you typed your link above, it seems you don't want to use www. Therefore, put this in your .htaccess file:
# Redirect www URLs to non-www
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.austinweddingvenues\.org [NC]
RewriteRule (.*) http://austinweddingvenues.org/$1 [R=301,L]
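To see what that rule actually does, here is a rough simulation of the mapping in Python. This is purely illustrative (it is not how Apache evaluates mod_rewrite directives), and the helper name is made up for the example:

```python
import re

def redirect_target(host, path):
    """Roughly mimic the .htaccess rule above: 301 www URLs to the bare domain."""
    # RewriteCond equivalent: the request host is the www variant (case-insensitive)
    if re.match(r"^www\.austinweddingvenues\.org$", host, re.IGNORECASE):
        # RewriteRule equivalent: redirect to non-www, keeping the path
        return "http://austinweddingvenues.org" + path
    return None  # non-www requests are left alone

print(redirect_target("www.austinweddingvenues.org", "/contact.html"))
# -> http://austinweddingvenues.org/contact.html
print(redirect_target("austinweddingvenues.org", "/contact.html"))
# -> None (already on the preferred host)
```

The point of the 301 is that Google stops treating www and non-www as two competing copies of the same page.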
Be sure to also specify the preferred domain in Google Webmaster Tools. You will need to verify that you have access to both the www and non-www versions, so do that first, before adding the lines above to your .htaccess file.
-
Good advice from these people. Check Google Webmaster tools for any notifications. I'm leaning towards Deb's explanation: Exact Match Domain.
You only have 17 root domains linking to you. Try to increase that number.
You also only have 2 pages indexed in Google. Have you submitted your sitemap to them? Have you checked to make sure you aren't blocking the bots?
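On the blocked-bots point, one quick sanity check is to run the site's robots.txt through Python's standard-library parser. This is a sketch: the robots.txt content below is hypothetical, so substitute whatever is actually served at http://austinweddingvenues.org/robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- paste in the site's real file to test it.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the "User-agent: *" group here
print(parser.can_fetch("Googlebot", "http://austinweddingvenues.org/"))         # -> True
print(parser.can_fetch("Googlebot", "http://austinweddingvenues.org/admin/x"))  # -> False
```

If the homepage comes back False for Googlebot, that alone would explain only 2 pages being indexed.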
-
When did you lose the ranking? It seems like the EMD update is the culprit, but there's no way to tell unless we have the exact date.
-
Hi Ron,
The first thing to ask is: when you saw your website ranking around 4th, how did you check? Did you use a ranking tool or your browser? The reason I ask is that when you use a browser, your search results can be personalized to you. So if you often visit your own website, you may see it ranking for a keyword search when it really isn't placed there. To find out exactly where you rank, use a non-personalized search or a ranking tool like the one on SEOmoz.
Second, it doesn't look like you are doing anything odd on-site (i.e., no black-hat SEO tactics), so that's good. And your backlink count is low, so I'm not sure that would be the cause, but you should check Google Webmaster Tools to confirm.
Thirdly, a drop in rankings could be due to competition. If your competitors are actively optimizing their websites and building quality links and you aren't, they can knock you down the rankings.
Hope that helps some.
Davinia
-
Shot in the dark here, but have you checked your Google Webmaster account for any messages from Google? I know some folks whose rankings went way down and didn't know why until they checked and found the dreaded "unnatural links" letter waiting for them.