Why are we not seeing similar traffic patterns in a new market?
-
Good afternoon!
We have a large real estate site with over 400,000 URLs. We do pretty well with long-tail search terms (like addresses: 123 Main Street, Atlanta, GA), so we get a decent amount of traffic (3,500-4,000 uniques a day). Two months ago we opened up in a new market (Nashville) and hoped to see similar traffic for that market after a few months, but so far we haven't. In fact, we only get about 200 visits a day. I can't figure out why it's taking so long to generate the kind of traffic in Nashville that we see in Atlanta. All of the Nashville properties are in our sitemap and are being indexed by Google. Any ideas why we aren't seeing similar results?
Thanks in advance for any help you can provide!
David
-
400,000 isn't an unreasonable number of pages for a real estate site if the pages have reasonable amounts of unique content and are implemented well within the site.
That said, it's much more difficult to pull off with a site that has lower DA & PA and few links.
-
True, I don't think site size will by itself hurt you, but I do think there is something to building your site up over time.
I think making a great first impression goes a long way. If Google finds the bathroom stall before it sees the grand lobby, it may just quit at "this is a crap website."
... granted, I've never built a website with 400,000 pages. I mean, there are a lot of bugs in the world, but would you read 400,000 pages about pest control?
-
Here are a few questions I would start with, based on what you've described:
- How much of each site's traffic is coming from Google?
- How many inbound links does the original site have?
- How thin is the content of the new site?
- How quickly were the 400,000 pages added to the new site?
- How many of those pages are indexed by Google right now?
- How original is the content?
While domain age in and of itself isn't always a factor, a site's history in the search engine is. I'm going to assume that your new pages are created dynamically. If you simply plop down 400,000 pages, it's going to take Google some time to index and evaluate all of them, whereas your older site may have started with more history: it may have begun with fewer pages and gradually built up.
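One way to quantify that indexing lag is to compare how many URLs you submit per market against the indexed counts Google reports. Here's a minimal sketch that tallies sitemap URLs per market; the sitemap location and the city/state patterns in the URLs are hypothetical, so adjust them to your own site, then compare the totals against site: queries or Webmaster Tools' indexed-page counts.

```python
# Count sitemap URLs per market (recursing into sitemap indexes) so the
# submitted totals can be compared with Google's indexed counts.
import urllib.request
import xml.etree.ElementTree as ET
from collections import Counter

SITEMAP_URL = "http://example.com/sitemap.xml"  # hypothetical location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(url):
    """Yield every <loc> in a sitemap, recursing into sitemap index files."""
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    if root.tag.endswith("sitemapindex"):
        for loc in root.findall("sm:sitemap/sm:loc", NS):
            yield from sitemap_urls(loc.text.strip())
    else:
        for loc in root.findall("sm:url/sm:loc", NS):
            yield loc.text.strip()

counts = Counter()
for page in sitemap_urls(SITEMAP_URL):
    lowered = page.lower()
    if lowered.endswith("-ga") or "atlanta" in lowered:  # hypothetical URL patterns
        counts["Atlanta"] += 1
    elif lowered.endswith("-tn") or "nashville" in lowered:
        counts["Nashville"] += 1

for market, n in counts.items():
    print(f"{market}: {n} URLs submitted")
```

If Nashville's indexed count sits far below its submitted count, the lag itself is your answer; if the counts match, the problem is ranking, not indexing.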
From my own experience, I have used the same format on several different websites, and even though the formula is the same and the search engine is applying the same set of rules, I still get varying results. The formula works most of the time, so I just move on and let the sites simmer. If formula A never kicks in, I move to formula B.
You may want to try formula A in a different city to make sure your first success wasn't just luck.
-
The number of URLs alone, independent of anything else, shouldn't be a reason to randomly deindex a large portion of your site down to some arbitrary number. Can you offer any more background on your suggestion?
-
You say that you have "400,000 URLs" and you do not seem to realize that THIS is the problem!
You need to de-index a lot of your site to regain Google's trust.
Trust does not come from having 400,000 URLs; it comes from having 200+ good pages with original content per location.
-
Hi David,
I'm following up on older questions that are still marked unanswered. Are you still seeing this discrepancy, or has it sorted itself out now that you've had the site up longer? Are you still looking for advice about this issue?
-
It looks like the traffic disparity is largely due to your recent entry into the market, which is the primary contributor to the SEO factors already mentioned. Many factors are at play, as usual, but here are some interesting side notes:
(According to Wikipedia)
The Atlanta metropolitan area, with 5,268,860 people, is the third largest in the Southern United States and ranks fourth in the number of Fortune 500 companies headquartered within city boundaries, behind New York City, Houston, and Dallas. The Atlanta metropolitan area ranks as the 10th largest cybercity (high-tech center) in the US, with 126,700 high-tech jobs (tech jobs = high turnover = more home sales = internet-savvy population).
Atlanta's real-estate market is far more dynamic than Nashville's.
The Nashville metropolitan area has 1,600,358 people.
Music industry professionals don't tend to move around much, especially country music industry professionals.
My opinion: with time and some SEO effort, you can reduce the traffic gap, but not close it.
-
In order to rank for specific terms, you must have relevant links with the right anchor text pointing to those pages. If you have just made more pages and are only linking to yourself, you are effectively telling everyone that you are an authority without anyone else's opinion. So if other pages aren't voting for you (i.e., linking to you) to confirm that your Nashville pages are what they say they are, you may just have a whole lot of low-authority pages and need to build up more value.
Taking a look at your site, you only have a little over 1,000 links, and most of them point to the domain root. Deep linking is going to be key to your success; otherwise you are trying to determine your own relevance. A quick way to measure the split is sketched below.
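As a rough illustration, here is a minimal sketch that splits a backlink export into homepage versus deep-page links. The file name and the "Target URL" column are hypothetical; match them to whatever your backlink tool exports.

```python
# Split a backlink CSV export into homepage links vs. deep (property page) links.
import csv
from urllib.parse import urlparse

homepage_links = 0
deep_links = 0

with open("backlinks_export.csv", newline="") as f:  # hypothetical export file
    for row in csv.DictReader(f):
        path = urlparse(row["Target URL"]).path  # hypothetical column name
        if path in ("", "/"):
            homepage_links += 1
        else:
            deep_links += 1

total = homepage_links + deep_links
print(f"Homepage links: {homepage_links} ({homepage_links / total:.0%})")
print(f"Deep links: {deep_links} ({deep_links / total:.0%})")
```

The higher the deep-link share, the more of your authority reaches the property pages you actually want to rank.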
Hope this helps!
-
This is a side note to the previous comments. You can try to boost your rankings quickly (quick is relative) with social media signals. There's a real-estate agent in my city, http://hometourgoodness.com/, who gets a lot of interaction on Facebook and Twitter. If you are able to engage in social media, you will boost your inbound traffic and help your SERPs.
-
This is guesswork on my part since I don't know the URLs, but is it possible that people in Nashville use different search terms than the ones you are ranking for, or have other ways to search for property? And do you target those search terms as well? I've seen something similar in Germany, where it turned out people were using a slightly different version of the keywords.
Are all pages in the Nashville section being measured properly? Is the GA code implemented on all of the pages? You mention having only about 70% of the volume compared to Atlanta, but the gap you describe is far larger than that, so it's worth ruling out a tracking problem.
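A quick way to spot-check the tracking is to fetch a sample of Nashville pages and look for the Analytics snippet. This is a minimal sketch; the URLs are hypothetical and the regex assumes a classic UA-style account ID, so adjust both for your setup.

```python
# Spot-check whether the Google Analytics snippet appears on sample pages.
import re
import urllib.request

GA_PATTERN = re.compile(r"UA-\d{4,10}-\d{1,4}")  # classic GA account ID format

sample_pages = [
    "http://example.com/123-Main-Street-Nashville-TN",  # hypothetical URLs
    "http://example.com/456-Oak-Avenue-Nashville-TN",
]

for url in sample_pages:
    try:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        status = "GA snippet found" if GA_PATTERN.search(html) else "NO GA snippet"
    except OSError as err:
        status = f"fetch failed: {err}"
    print(f"{url}: {status}")
```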
And I think it could also be a matter of trust. Is your brand new to the Nashville area? Then people might be more responsive towards your competitors.
Geddy
-
Hi Barry,
We are not ranking as well for Nashville, even though we follow the same formula for links/layout in both markets. We don't have any inbound links to property pages (example: http://clickscape.com/9753-Palmeston-Place-0-Johns-Creek-GA), but we still rank on the first page of SERPs in Atlanta for these long-tail terms thanks to site architecture/navigation. The formula is not working in Nashville, and I was curious whether there was a reason why that might be.
Also, you are correct: the search volume in Nashville is about 70% of the volume in Atlanta, so that definitely plays a role.
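That said, volume alone can't explain the gap. Here's a quick back-of-the-envelope check using the figures from this thread (a minimal sketch, not real measurements):

```python
# If Nashville search volume is ~70% of Atlanta's, demand alone predicts
# far more than the ~200 visits/day actually observed.
atlanta_daily_uniques = (3500, 4000)  # figures quoted in this thread
volume_ratio = 0.70
expected = tuple(round(v * volume_ratio) for v in atlanta_daily_uniques)
actual = 200

print(f"Expected if volume were the only factor: {expected[0]}-{expected[1]} visits/day")
print(f"Actual: {actual} visits/day (about {actual / expected[0]:.0%} of the low estimate)")
```

At roughly 8% of what the volume ratio predicts, something beyond market size (links, indexing lag, rankings) is holding Nashville back.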
-
Are you ranking as well for terms in Nashville as you are in Atlanta?
I would imagine other sites are outranking you, and you need to build Nashville-specific links to some of your pages to start showing up.
Also, are the search volumes for real estate terms as large in Nashville as in Atlanta?
A combination of all of these could be the cause of the lower traffic.
-
All things being equal, the age of a domain accounts for a very tiny amount of ranking weight, if any at all. It's not really a reason I would ever give for why a website is or is not ranking for any given set of terms.
-
First, are you being outranked by competitors? Try doing some long-tail searches similar to the ones that bring you traffic in areas like Atlanta.
If you seem to be ranking for those searches just as well as you normally would, it could be that areas like Nashville have fewer heavy internet users. Keep in mind that a well-ranking result based in NYC will draw more clicks than one in the boondocks of Oregon, simply because of the number of people in the area searching for the item in question.
-
Some of the following will be guesswork since you didn't provide any URLs, but I'll try my best. Has the older (Atlanta-targeted) website been around for quite a bit longer than the newer (Nashville) domain? Besides the links the older domain has most likely collected over that time, domain age appears to influence ranking on its own, even if only slightly (#10 in the ranking factors: http://www.seomoz.org/article/search-ranking-factors#ranking-factors). Does the Nashville-targeted website have the same amount of local (and related) backlinks as the Atlanta-targeted one? You've mentioned that the Nashville website has only been live for about two months, which I'd consider a really short time to draw any real conclusions, to be honest.
With some more time and the same effort you've put into the Atlanta-targeted website, I'm sure the new one will perform in a similar fashion!