Why are we not seeing similar traffic patterns in a new market?
-
Good afternoon!
We have a large real estate site with over 400,000 URLs. We do pretty well with long-tail search terms (like addresses: 123 Main Street, Atlanta, GA), so we get a decent amount of traffic (3,500-4,000 uniques a day). Two months ago we opened up in a new market (Nashville) and hoped to see similar traffic for that market after a few months, but so far we haven't. In fact, we only get about 200 visits a day. I can't seem to figure out why it's taking so long to generate traffic in Nashville comparable to what we see in Atlanta. All of the Nashville properties are in our sitemap and are being indexed by Google. Any ideas why we aren't seeing similar results?
Thanks in advance for any help you can provide!
David
-
400,000 isn't an unreasonable number of pages on a real estate site if they have reasonable amounts of unique content and the pages are implemented well within the site.
That said, it's much more difficult to pull off with a site that has lower DA & PA and few links.
-
True, I don't think site size will by itself hurt you, but I do think there is something to building your site up over time.
I think making a great first impression goes a long way. If Google finds the bathroom stall before it sees the grand lobby, Google may just quit, thinking, "This is a crap website."
... granted, I've never built a website with 400,000 pages... I mean there are a lot of bugs in the world, but would you read 400,000 pages about pest control?
-
Here are a few questions I would start with, based on what you've asked:
- How much of each site's traffic is coming from Google?
- How many inbound links does the original site have?
- How thin is the content of the new site?
- How quickly were the 400,000 pages added to the new site?
- How many of those pages are indexed by Google right now?
- How original is the content?
While domain age in and of itself isn't always a factor, a site's history in the search engine is. I'm going to assume that your new site is generated dynamically. If you simply plop down 400,000 pages, it's going to take some time for Google to index and evaluate all of them, whereas your older site may have started with more history: it may have begun with fewer pages and gradually built up.
From my own experience, I have used the same format on several different websites, and while it would seem that the same formula under the same set of search engine rules should give the same results, I still get varying results. The formula works most of the time, so I just move on and let the sites simmer. If formula A never kicks in, then I move to formula B.
You may want to try a different city to check your formula A and make sure that your first success wasn't just luck.
-
The number of URLs alone, independent of anything else, shouldn't be a reason to randomly de-index a large chunk of your site down to an arbitrary number. Can you offer any more background on your suggestion?
-
You say that you have "400,000 URLs" and you do not seem to realize that THIS is the problem!
You need to de-index a lot of your site to regain Google's trust.
Trust does not come from having 400,000 URLs - it comes from having 200+ good pages with original content per location.
-
Hi David,
I'm following up on older questions that are still marked unanswered. Are you still seeing this discrepancy, or has it sorted itself out now that you've had the site up longer? Are you still looking for advice about this issue?
-
It looks like the disparity in traffic is largely due to your recent entry into the market, which underlies the SEO factors already mentioned. Many factors are at play, as usual, but here are some interesting side notes:
(According to Wikipedia)
The Atlanta metropolitan area, with 5,268,860 people, is the third largest in the Southern United States and ranks fourth in the number of Fortune 500 companies headquartered within city boundaries, behind New York City, Houston, and Dallas. The Atlanta metropolitan area also ranks as the 10th largest cybercity (high-tech center) in the US, with 126,700 high-tech jobs (tech jobs = high turnover = more home sales = internet-savvy population).
Atlanta's real-estate market is far more dynamic than Nashville's.
The Nashville metropolitan area has about 1,600,358 people, roughly 30% of Atlanta's.
Music industry professionals don't tend to move around much, especially country music industry professionals.
My opinion: with time and some SEO effort, you can reduce the traffic gap, but not close it.
-
In order to rank for specific terms, you must have relevant links with the right anchor text pointing to those pages. If you have just made more pages and are only linking to yourself, then you are effectively telling everyone that you are an authority without anyone else's opinion. So if other pages aren't voting for you (i.e., linking to you) to confirm that your Nashville pages are what they say they are, you may just have a whole lot of low-authority pages and need to build up more value.
Taking a look at your site, you have only a little over 1,000 links, with most of them pointing at the root domain. Deep linking is going to be key to your success; otherwise you are trying to determine your own relevance.
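If you can export your backlinks to a CSV (from Open Site Explorer or whatever link tool you use), a few lines of Python will show the homepage-vs-deep-page split. This is only a rough sketch; the filename and the "Target URL" column name are assumptions, so adjust them to whatever your export actually contains:

```python
# Minimal sketch: from a backlink export (CSV), count links that point at the
# homepage vs. deep pages. The filename and the "Target URL" column name are
# assumptions -- adjust them to match your export.
import csv
from urllib.parse import urlparse

homepage_links = 0
deep_links = 0

with open("backlinks.csv", newline="") as f:  # assumed filename
    for row in csv.DictReader(f):
        path = urlparse(row["Target URL"]).path  # assumed column name
        if path in ("", "/"):
            homepage_links += 1
        else:
            deep_links += 1

total = homepage_links + deep_links
print(f"{homepage_links} of {total} links point at the homepage; "
      f"{deep_links} are deep links")
```

If the vast majority land on the root domain, that backs up the point above: the Nashville property pages have nothing external vouching for them yet.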
Hope this helps!
-
This is a side note to the previous comments. You can try to boost your rankings quickly (quick is relative) with social media metrics. Here's a real-estate agent in my city, http://hometourgoodness.com/, who gets a lot of interaction on Facebook and Twitter. If you are able to engage in social media, you will boost your inbound traffic and help your SERP rankings.
-
For me this is guesswork, as I don't know the URLs, but is it possible that people in Nashville use different search terms than the ones you are ranking for, or have other ways to search for property? And do you use those search terms as well? I've seen something similar in Germany, where it turned out people were using a slightly different version of the keywords.
Are all pages in the Nashville section being measured properly? Is the GA code implemented on all the pages? I ask since you mention that you only have about 70% of the traffic compared to Atlanta.
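A quick way to sanity-check that is to crawl the Nashville URLs from the sitemap and flag any page missing the tracking snippet. This is only a rough sketch; the sitemap URL and the tracking ID are placeholders you'd swap for your own:

```python
# Minimal sketch: check that the GA tracking ID appears on every Nashville
# page listed in the sitemap. Sitemap URL and tracking ID are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap-nashville.xml"  # placeholder
TRACKING_ID = "UA-XXXXXXX-1"                                  # placeholder

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_xml = urllib.request.urlopen(SITEMAP_URL).read()
urls = [loc.text for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", ns)]

missing = []
for url in urls:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    if TRACKING_ID not in html:
        missing.append(url)

print(f"{len(missing)} of {len(urls)} pages are missing the GA snippet")
for url in missing:
    print(url)
```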
And I think it could also be a matter of trust. Is your brand new to the Nashville area? Then people might be more responsive towards your competitors.
Geddy
-
Hi Barry-
We are not ranking as well for Nashville, but we follow the same formula for links/layout in both markets. We don't have any inbound links to property pages (example: http://clickscape.com/9753-Palmeston-Place-0-Johns-Creek-GA), but we rank on the first page of SERPs in Atlanta for similar long-tail terms thanks to our site architecture/navigation. The formula is not working in Nashville, and I was just curious whether there was a reason why that might be.
Also, you are correct. The search volume in Nashville is about 70% of the volume in Atlanta so that definitely plays a role.
-
Are you ranking as well for terms in Nashville as you are in Atlanta?
I would imagine other sites are outranking you, and you need to build Nashville-specific links into some of your pages to start showing up.
Also, are the search volumes for real estate terms as large in Nashville as in Atlanta?
A combination of all of these could be the cause of the lower traffic.
-
All things being equal, the age of a domain accounts for a very tiny amount of ranking weight, if any at all. It's not really ever a reason I would give for why a website is or is not ranking for a given set of terms.
-
First, are you being outranked by competitors? Try doing some long-tail searches similar to the ones you rank for in Atlanta.
If you seem to be ranking for those searches just as well as you normally would, it could be that areas like Nashville have fewer heavy internet users. Keep in mind that a well-ranking result based in NYC is likely to bring more traffic than one based in the boondocks of Oregon... simply because of the number of people in the area searching for the item in question.
-
Some of the following will be guesswork since you didn't provide any URLs, but I'll try my best. Has the old (Atlanta-targeted) website been around for quite a bit longer than the newer (Nashville) domain? Besides the number of links the older domain has most likely collected, domain age appears to influence ranking on its own, even if only slightly (#10 in the ranking factors: http://www.seomoz.org/article/search-ranking-factors#ranking-factors). Does the Nashville-targeted website have the same number of local (and related) backlinks as the Atlanta-targeted website? You've mentioned that the Nashville website has only been live for about 2 months, which I'd consider a really short time to draw any real conclusions from, to be honest.
With some more time and the same effort as you've put into the Atlanta-targeted website, I'm sure the new one will perform in a similar fashion!