Landing Page or Doorway? That is the question!
-
Hi Guys,
So, I'm looking at a project to build a series of landing pages that cross-map cities with surnames, e.g. Sydney + Smyth, New York + Fitzpatrick. On those pages I'll pull in relevant name-based listings from our directory and try to display some other tailored information. The page itself is the end goal - it is definitely not a doorway in the classic sense of encouraging someone to then go on to the main site. I want the user to fill out a form on this page because they realise they've landed on a valuable service.
I'm looking at potentially 500 names against 2,000 locations, creating 1,000,000 landing pages. Although some of the content will be repetitive, I genuinely believe someone doing the appropriate search and finding our page will derive value from it, as our whole business is designed to answer their needs. However, I'm worried that Google may classify these pages as doorway pages. Could anyone please shine the light of experience on this for me?
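For scale, the page inventory here is just the Cartesian product of the two lists. A quick sketch (the slug format and the example values are purely illustrative, not the actual site structure):

```python
from itertools import product

# Illustrative stand-ins; the real lists would hold ~500 surnames and ~2,000 cities.
surnames = ["smyth", "fitzpatrick"]
cities = ["sydney", "new-york"]

# One landing page per (city, surname) pair.
slugs = [f"/{city}/{surname}/" for city, surname in product(cities, surnames)]
print(len(slugs))  # 2 x 2 = 4 here; 2,000 x 500 = 1,000,000 at full scale
```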
Thanks!
-
Thanks EGOL. Good practical advice! I'll take it into consideration. I guess I just reckon that this approach will cover the long tail to a marginal extent, and I can ramp up my efforts in a more hands-on way for the main traffic searches.
-
I used to have some big sites built from databases. The challenge with sites of 100,000 or 1,000,000 pages is having enough PR to get them into the index and hold them there.
If you have a big site you need a regular flow of spiders into and out of every page, or Google will forget them and they will fall out of the index. Even if they stay in the index, they will compete poorly.
One of my sites had about 150,000 pages, and getting and holding it in the index took a lot of links. Today I bet you would need at least 30 PR5 links that connected deeply into the site to maintain an adequate flow of spiders for healthy indexing of a site that size. A site with a million pages is going to require a lot more.
So, in addition to creating this site you are going to need some powerful link sources for indexing.
-
Hi EGOL,
Thanks for the swift response. That makes a lot of sense and correlates with how I thought Google approached this issue. I guess I want to just suck it and see whether Google will de-index the pages - I'm aware that the content won't be vastly unique on all of them at first, but it will be on a steadily increasing percentage. What I'm very worried about is being permanently punished if they feel I'm doing something very wrong, having my entire site downgraded (as I have heard can happen with doorway pages), or having those pages with poor content not be re-indexable in 4 months' time when there is better content on them.
Do you have any thoughts on these concerns?
Thanks
-
Google crawlers are unable to determine if you are offering a "valuable service".
However, if you have a program crank out a million pages that all look like this.....
"yada yada yada LOCATION yada yada yada SURNAME yada yada yada yada"
"yada yada yada LOCATION yada yada yada SURNAME yada yada yada yada"
"yada yada yada LOCATION yada yada yada SURNAME yada yada yada yada"
.... Google will probably index them and display them in the SERPs for a few days.... (and you might make a little money)... but then you will see the number of pages in their index decline rapidly as they figure out that these are cookie-cutter pages and toss them out as duplicate content.
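You can put a rough number on how cookie-cutter a template like that is by measuring token overlap between two generated pages. This is just a back-of-the-envelope illustration (the overlap metric is my own stand-in, not anything Google is known to run):

```python
from collections import Counter

template = "yada yada yada {location} yada yada yada {surname} yada yada yada yada"

def token_overlap(a: str, b: str) -> float:
    """Fraction of tokens the two texts share (multiset intersection)."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    shared = sum((ca & cb).values())  # & keeps the minimum count of each token
    return shared / max(sum(ca.values()), sum(cb.values()))

page1 = template.format(location="Sydney", surname="Smyth")
page2 = template.format(location="Chicago", surname="Fitzpatrick")

# 10 of the 12 tokens on each page come straight from the template.
print(round(token_overlap(page1, page2), 2))  # 0.83
```

When the filled-in slots are the only thing that differs, the overlap stays this high across all million pages, which is exactly the pattern a duplicate-content filter is built to catch.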
However, if you have the resources to write a million pages of absolutely unique, non-duplicating content then you might make a lot of money from this site.