How can I make my sub-domain bring in more traffic?
-
We have a social networking site with user-generated content, and users have to log in before they can access the platform.
We are in the process of creating indexable pages that Google's bots can pick up. These pages will sit in front of the login pages and will be open for guest access.
Can you advise on the best way of optimising these pages for higher rankings and traffic?
-
A good place to start is identifying the search volume of the terms you are looking to target, using the Google AdWords keyword tool.
From there you can assess the competition from the sites currently ranking and put together an optimisation strategy for your content.
The point is that you need to identify what you want to rank for first; after that it is the age-old formula: good-quality, engaging content and backlinks from authoritative sources.
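To make the on-page side of that concrete, here is a minimal sketch of the elements one of those guest-accessible landing pages would normally carry; the domain, page names, and copy below are placeholders rather than anything from the actual site:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Unique, keyword-targeted title and description for each landing page -->
  <title>Topic Name Discussions | Example Network</title>
  <meta name="description" content="Browse public discussions about Topic Name on Example Network. Join free to post and reply.">
  <!-- Allow indexing and canonicalise to the guest URL, not the login URL -->
  <meta name="robots" content="index, follow">
  <link rel="canonical" href="https://community.example.com/topics/topic-name">
</head>
<body>
  <h1>Topic Name discussions</h1>
  <!-- Serve a readable excerpt of the user-generated content as plain HTML so
       crawlers and guests see real content rather than a login wall -->
  <article>
    <h2>Example thread title</h2>
    <p>The first few sentences of the thread, visible without logging in…</p>
    <a href="/login?next=/topics/topic-name">Log in to read the full thread</a>
  </article>
</body>
</html>
```

The key point is that the excerpt content is rendered server-side in the HTML, so the pages Google crawls are the same pages guests see.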
Related Questions
-
Can service request pages be indexed for a service site?
I think there is no point in indexing the service request pages on a service site, and it seems to delay the indexing of the main pages. Does anyone have experience with indexing service request pages and their results?
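If you do decide to keep them out of the index, a minimal sketch (assuming the request pages are ordinary HTML pages you control) is a robots meta tag on each one:

```html
<head>
  <!-- Keep service request pages crawlable but out of Google's index -->
  <meta name="robots" content="noindex, follow">
</head>
```

One caveat: if the same URLs are also disallowed in robots.txt, Googlebot cannot fetch the page to see the noindex, so use one approach or the other.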
On-Page Optimization
-
My site auto redirects http to https. This is causing redirect chains. What can I do?
I noticed that Moz flags a lot of redirect chain issues on my site. I realized that this is mostly because the site automatically redirects http to https, and when I create a new URL (when a URL changes, for example) it is automatically flagged as a chain.

Example:
http://www.example-link
auto-redirects to:
https://www.example-link
which is then redirected to:
https://www.example-link-changed (when the address actually changes)

I don't seem to have any control over changing where the initial http redirect goes. Any advice on fixing this problem?
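For what it's worth, the usual fix is to make each page-level redirect point directly at the final https address, so the blanket http-to-https rule never gets the chance to add an extra hop. A rough .htaccess sketch, assuming Apache and using placeholder URLs rather than your real ones:

```apache
RewriteEngine On

# Page-specific moves point straight at the final https URL (one hop).
RewriteRule ^example-link/?$ https://www.example.com/example-link-changed [R=301,L]

# Blanket http -> https redirect for everything that did not match above.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

If the http-to-https redirect is handled by your host or a CDN rather than in .htaccess, the same principle applies: the old-to-new redirect just needs to target the https version of the new URL.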
On-Page Optimization
-
Lost SEO contract, new SEO wants us to do the following - can you explain why?
1. Make prokem.co.uk the master domain rather than prokem-corrosion-protection.com
2. Ensure each http URL is 301 redirected to its https counterpart via htaccess rather than in Plesk
3. 301 redirect each www.prokem-corrosion-protection.com URL to its co.uk counterpart via htaccess. I can provide a list of pages to redirect, as there are a number of duplicate pages that will need removing.

It probably makes sense to implement these other changes at the same time:
- Remove all of the canonical tags currently on the site.
- Leverage browser caching by following Google's page speed recommendations - https://developers.google.com/speed/docs/insights/LeverageBrowserCaching
- Losslessly compress all of the website's images.
- Combine and minify the website's JavaScript.
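Not an explanation of the "why", but for reference, points 2 and 3 above would typically look something like this in .htaccess. This is only a sketch: it assumes Apache, assumes www.prokem.co.uk is the chosen master host (confirm www vs non-www with the new SEO), and uses a blanket one-to-one mapping rather than the page-by-page list they mentioned:

```apache
RewriteEngine On

# Point 3: send every prokem-corrosion-protection.com URL to its prokem.co.uk counterpart in one hop.
RewriteCond %{HTTP_HOST} ^(www\.)?prokem-corrosion-protection\.com$ [NC]
RewriteRule ^(.*)$ https://www.prokem.co.uk/$1 [R=301,L]

# Point 2: force https on the master domain itself.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.prokem.co.uk%{REQUEST_URI} [R=301,L]
```

Part of the reasoning for doing this in htaccess rather than Plesk is presumably just to keep all the redirect logic in one place that can be version-controlled and reviewed.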
On-Page Optimization
-
How can I make this page more valuable from an SEO perspective? Any suggestions?
Hi experts, how can I improve the following page? Please share your thoughts: https://www.protoexpress.com/content/quicklinks.jsp Thanks
On-Page Optimization
-
Why would changing 404 pages increase traffic by 9%?
Neil Patel claimed in this article that by creating a custom 404 page that links out to 25 to 50 random internal pages on the website, he was able to increase the traffic of TechCrunch by 9%. I'm a bit skeptical about this claim. A couple of questions:

- Is this theory sound? If you've personally tried this or have read other articles supporting Neil, I'd love to learn more.
- Would a big site like TechCrunch really have problems with Google not indexing all of its pages?
- Does getting more pages crawled help you get more traffic? Specifically, would it help a site like mine? For reference, my site gets an average of 12,040 pages crawled per day over the last 90 days, and 28,922 pages are currently indexed.
- Are there any possible downsides to trying this?

Thanks!
On-Page Optimization
-
Agency Domain Authority Boosting Activity
Hi guys,

I have been reading up a bit on methods for boosting Domain Authority, and I'm generally finding that the best way is producing unique, relevant content through blogs and other kinds of articles. Having multiple clients in an agency means there is limited time for this, so I need something else to help boost Domain Authority.

I do a fair bit of link building through online directories; however, I am also finding that most blog comment sections now apply 'nofollow' to reduce spam. There are plenty of free online directories, many with high Domain Authority, but listings can take months to be approved.

I am carrying out other activities to boost keyword rankings in Google for our clients, but I need some help with getting their Domain Authority up. Does anyone know of an efficient method for boosting Domain Authority for an agency with many clients, where blog writing for each may not be a viable option? Would be great to hear anyone's ideas!
On-Page Optimization
-
Long URLs that make no sense
Hi, just joined. The crawl states that I am getting a lot of errors; it looks like the spider is getting confused and looping back on itself? Is there a way to see where the crawl was formulated (i.e. where it came from)?

It is generating URLs like:
http://www.wickman.net.au/wineauction/wine_auction_alert.aspx/auction/auction/auction/auction/auction/auction/Default.aspx
from
http://www.wickman.net.au/wineauction/wine_auction_alert.aspx
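One common cause of this pattern, offered as a guess rather than a diagnosis of your site: a document-relative link combined with a server that answers any path. A hedged illustration, with made-up link text:

```html
<!-- If the server answers a URL like
     /wineauction/wine_auction_alert.aspx/auction/Default.aspx (many .aspx setups
     accept extra path segments like this) and that page contains a document-relative
     link such as the one below, a crawler resolves it against the current path and
     gets .../auction/auction/Default.aspx, then .../auction/auction/auction/...,
     one level deeper on every visit. -->
<a href="auction/Default.aspx">Current auction</a>

<!-- A root-relative (or absolute) href resolves to the same address from every
     page, so the loop cannot form. -->
<a href="/wineauction/auction/Default.aspx">Current auction</a>
```

Checking the page source for relative hrefs like the first one would be the place to start.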
On-Page Optimization
-
How to make FB comments crawlable by Google? <noscript>?
We get tons of FB comments, but they're all in an iframe, so Google doesn't give us any credit for them. We found a solution: turn all the comments into HTML and hide them from readers with <noscript>.

Will this help? I heard that Google considers <noscript> a scammy practice. Is that true?

How do you guys make your FB comments SEO friendly?
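For context, the pattern being described usually looks something like the sketch below: the standard comments plugin embed plus a <noscript> copy of the same comments as plain HTML. The fallback markup and example comments are placeholders, and whether Google trusts or discounts <noscript> content is exactly the open question here; the usual advice is that the fallback must mirror what the plugin actually shows.

```html
<!-- Standard Facebook comments plugin: rendered via JavaScript into an iframe,
     so the comment text is not part of this page's indexable HTML. -->
<div class="fb-comments" data-href="https://www.example.com/article" data-numposts="10"></div>

<!-- Fallback for user agents without JavaScript: the same comments as plain HTML.
     Placeholder content; in practice this would be generated server-side from the
     fetched comments so it matches what the plugin displays. -->
<noscript>
  <ul class="comments">
    <li><strong>Jane D.</strong> Great post, thanks for sharing!</li>
    <li><strong>Sam K.</strong> Tried this last week and it worked for me.</li>
  </ul>
</noscript>
```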
On-Page Optimization