How to use my time: Make my site bigger or make link wheels?
-
I have a site which consists of about 500 pages. It's the biggest in its tiny niche, and I'm making a living from it (it gets me clients). So this is important to me.
I have access to tons and tons of non-copyrighted, relevant texts. This text is not on the web, and thus would be unique to Google. Although the text is relevant, it's not really useful for my visitors.
How to use this text and get the most of my time spent?
1. Make thousands of articles on my website, with internal links to the "selling" keyword pages?
2. Use the text to build a lot of link wheels, eventually linking to my main site?
Thanks a bunch! And if you have other suggestions, I'd love to hear them.
-
Great response, I love this:
_"I would give these texts to my competitor and hope that he puts the dead weight on his site."_
You make a very valid point.
Content needs to be engaging, and if your content is just that, you stand a good chance of users sharing it.
shares = exposure = more visits = more sales
-
_"Although the text is relevant, it's not really useful for my visitors."_
What's the goal here?
Are you trying to get useless traffic, or weigh down your website with a bunch of crap?
I would give these texts to my competitor and hope that he puts the dead weight on his site.
I believe that a small compact site with highly relevant pages competes better than a huge site with a ton of irrelevant crap.
I want my sites to be lean, mean fighting machines... not fat, out-of-shape wimps.
..............................
You asked about linkwheels. Don't do it.
-
Unless the content can be rewritten into something useful, I wouldn't bother publishing it. It will do you no favours to have content that no-one wants to read. You want visitors to be excited to know you have produced something, not groan and look away.
-
How would you make use of the content, then?
-
You do have to be careful about what you write and make sure it is of interest. If you suddenly publish a load of articles (well, you would stage them anyway) that no-one really wants to read, then, as already said, you will get a high number of bounces and very little time spent on the pages and site.
Unique content that is applicable to your audience is the way to go.
Andy
-- Oh, and steer clear of link wheels. They're old tricks that you will get nothing from any more. Google wants to see content, content, and more content.
-
Oh, I never really thought of bounce rate...
They're outdated documents about my niche. Let's say you have a site about "How to climb mountains the modern way"; my documents would be stories of old-school mountain climbing. Relevant, but useless and uninteresting.
-
Can you elaborate on your "unique but useless" (my paraphrase) content? Normally I'd say to get as much content out there as you can, but if it's not useful it might drive another signal you don't want: bounce rate.