Social bookmarking: many accounts or a few?
-
So I was wondering: do you think a piece of software such as BookmarkingDemon (or similar) would be worth the investment, or is a couple of "proper" accounts on Digg, StumbleUpon, etc. a better route? As an agency, would you create an account for each client, or just have one for your own agency and tag your clients within it? I'm not sure how much impact it will have on SERPs either way.
Thanks!
-
Thanks for the reply, that was my thinking really. So it's a case of registering almost a personal account, but then that links me to my company as well, as I don't want to be making up names, etc. Bit of a tricky one really!
-
Hi Lawrence,
I definitely steer towards having a few accounts and doing things manually rather than relying on automation. You can certainly achieve volume with automation, but the number of times you are successful in making something popular is not going to be high. Of course, if you are simply going for volume and volume alone, that is fine, but I also find that sort of activity has little impact on SERPs.
For true success in social media, you often have to maintain quality accounts that are neither tied to your client nor to your agency. People understandably dislike overt marketing on the social sites they go to for information and entertainment, so to be truly successful, your activity needs to be as genuine as you can make it. Again, this is assuming that you want high visibility on these sites, rather than just volume submissions.
Related Questions
-
Does creating too many parent pages damage my website's SEO?
I need to know how to keep my website structure well organised and ensure Google still recognises the key pages. I work for a travel company whose website needs to give customers various pieces of information, and this needs to be well organised in terms of structure. For example, customers need information on airport pick-ups and drop-offs for each of our destinations, but this isn't something that needs to rank on Google. The logical site structure would be to create a parent page such as thedragontrip.com/transfers/india. Is creating parent pages for unimportant content a bad idea?
Intermediate & Advanced SEO | nicolewretham
-
Canonicals, Social Signals and a Multi-Regional Website
Hi all, I have a website that is set up to target different countries using subfolders, e.g. /aus/, /us/, /nz/. The homepage itself is just a landing page that redirects each user to whichever country folder they belong to: somebody accesses https://domain/ and is redirected to one of the country-specific subfolders. The default subfolder is /us/, so all users are redirected there if their country has not been set up on the website. The content is mostly the same on each country site apart from localisation and, in some cases, content specific to that country.

I have set up each country subfolder as a separate site in Search Console, targeted /aus/ at AU users and /nz/ at NZ users, and left the /us/ version un-targeted to any specific geographical region. In addition, I've set up hreflang tags for each page on the site which link to the same content on the other country subfolders: /aus/ and /nz/ are targeted at en-au and en-nz respectively, and /us/ is targeted at en-us and x-default, as per various articles around the web.

We generally advertise our links without a country-code prefix, and the system automatically redirects the user to the correct country when they hit that URL. For example, somebody accesses https://domain/blog/my-post/ and a 302 is issued for https://domain/aus/blog/my-post/ or https://domain/us/blog/my-post/, etc. These country-less links are advertised on Facebook and in all our marketing campaigns.

Overall, I feel our website is ranking quite poorly and I'm wondering if poor social signals are a part of it. We have a decent social following on Facebook (65k) and post regular blog posts to our Facebook page that tend to pique quite a bit of interest. I would have expected this to contribute to our ranking at least somewhat. I am wondering whether the country-less links we advertise on Facebook are causing Googlebot to ignore them as a social signal for the country-specific pages on our website: Googlebot indexes https://domain/us/blog/my-post/ and looks for social signals for that URL specifically, but picks up nothing because the campaign URL we use is https://domain/blog/my-post/.

If that is the case, I am wondering how I would fix it so that /us/blog/my-post/, /aus/blog/my-post/ and /nz/blog/my-post/ receive the appropriate social signals, and whether changing the canonical URL of each page to the country-less URL would improve my social signals and performance in the search engines overall. I would be interested to hear your feedback. Thanks
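For reference, here is a sketch of what the hreflang and canonical markup described above might look like in the head of the /us/ version of a page (the "domain" URLs are the placeholders from the question, not a real setup):

<!-- On https://domain/us/blog/my-post/ -->
<link rel="canonical" href="https://domain/us/blog/my-post/" />
<link rel="alternate" hreflang="en-us" href="https://domain/us/blog/my-post/" />
<link rel="alternate" hreflang="en-au" href="https://domain/aus/blog/my-post/" />
<link rel="alternate" hreflang="en-nz" href="https://domain/nz/blog/my-post/" />
<link rel="alternate" hreflang="x-default" href="https://domain/us/blog/my-post/" />

One caveat on the proposed change: hreflang annotations are meant to point at canonical URLs, so setting each country page's canonical to the country-less URL would conflict with the hreflang setup sketched above.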
Intermediate & Advanced SEO | destinyrescue
-
Social Media Content Duplication?
Since social media signals are taken into account for SEO, do you think the content of social profiles will also be taken into account? Or is it the number of fans, shares and likes a profile has that counts, i.e. the more fans, the higher the social value for SEO? Secondly, when someone retweets or shares, the content gets duplicated across however many profiles retweeted or shared it. We also copy content from other pages, make slight changes and post it on our own pages. Do you think this will affect the SEO side of things?
Intermediate & Advanced SEO | welcomecure
-
Many pages with small unique content vs. one page with big content
Dear all, I am redesigning some areas of our website, eurasmus.com, and it's not clear to us which is the best option to follow. On our site we have a city area, e.g. www.eurasmus.com/en/erasmus-sevilla, which we are going to redesign, and a guide area where we explain about the city, etc.: http://eurasmus.com/en/erasmus-sevilla/guide/, all with unique content.

The thing is that at this point, due to lack of resources, our guide is not really deep, and we believe that creating a page with 500 characters of text for every area (transport, etc.) does not add extra value for users as it stands. It is not really user friendly either. On the other hand, these pages are getting some long-tail results, though not for our target keyword: they rank for terms like "transport in sevilla", whereas our target keyword would be "erasmus sevilla".

When redesigning the city area, we have to choose between: a) www.eurasmus.com/en/erasmus-sevilla with all the content on one page, about 2,500 characters of unique text; or b) www.eurasmus.com/en/erasmus-sevilla with a better amount of content and a nice redesign, but keeping the guide pages. What would you choose? Let me know what you think. Thanks!
Intermediate & Advanced SEO | Eurasmus.com
-
How many redirects are too many?
Hello everyone, I currently have a dynamic site, and it is my understanding that switching to a static site would be beneficial. I already have some 301s in place from when my site had a .php extension, pointing to the new extension, now with ./?... etc. Is it okay to re-redirect them? How many redirects are too many? Thank you in advance for suggestions. Have a fabulous Friday! Sandra
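As a sketch of what cleanly "re-redirecting" might look like (assuming an Apache .htaccess; the paths are made up), the usual approach is to update each old rule so the retired URL points straight at its final destination instead of chaining through the intermediate one:

# One hop (preferred): the retired .php URL goes straight to the final URL
Redirect 301 /services.php /new/services/
# Two hops (avoid): /services.php -> /services/ -> /new/services/

A couple of redirects pointing at the same final URL is fine; the thing to avoid is cumulative hops, since each hop adds latency and crawlers may stop following a chain after several jumps.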
Intermediate & Advanced SEO | rankmenow
-
How many internal links on one page?
I have seen Matt Cutts' video about links per page and know that too many links "may" harm the flow of link juice. But what should e-commerce sites do? We have category pages with a few thousand products in each of them, so linking to every product dilutes the PageRank flow. We could use pagination, but doesn't that hurt the user experience when someone needs to go 10 links deep to reach a product? And Google's robots won't update the information frequently because it will sit at the deepest level of our site. Our goal now is to make all our products appear in an infinite scroll, like Facebook's feed. We know that Google doesn't use Ajax to see more links, so robots and users without JavaScript would see the paginated results instead. Is this a good way to present all our products and links?
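To make that concrete, here is a minimal sketch (with hypothetical URLs) of the paginated fallback described above: plain links in the HTML source that crawlers and non-JavaScript users can follow, which the infinite-scroll script then enhances for everyone else:

<!-- Hypothetical category page: crawlable pagination links -->
<div class="pagination">
  <a href="/category/widgets/?page=1">1</a>
  <a href="/category/widgets/?page=2">2</a>
  <a href="/category/widgets/?page=3">3</a>
  <a href="/category/widgets/?page=2" rel="next">Next</a>
</div>

Because these are ordinary anchors in the source, Googlebot can follow them to every product even though it won't trigger the Ajax loading, while JavaScript users get the scroll-down experience.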
Intermediate & Advanced SEO | komeksimas
-
How to avoid too many "On Page Links"?
Hi everyone, I don't seem to be able to keep big G off my back, even though I don't engage in any black-hat or excessive optimization practices. Due to another unpleasant heavy SERP "fluctuation" I am in investigation mode yet again and want to take a closer look at one of the warnings within the SEOmoz dashboard: "Too many on page links". Looking at my statistics, this is clearly the case. I wonder how you can even avoid that at times.

I have a lot of information on my homepage that links out to subpages, and I get the feeling that even the links within the roll-over (or dropdown) menus are counted. In that case, of course, you end up with a crazy number of on-page links. What about blog-like news entries on your homepage that link to other pages as well? And not to forget the links that result from the tags underneath a post.

What am I trying to get at? Well, do you feel that a bad website template can cause this issue, i.e. are the links from roll-over menus counted as links on the homepage even though they are not directly visible? I am not sure how to cut down on the issue, as the sidebar modules are present on every page and thus raise the link count wherever you are on the site. On another note, I've seen plenty of homepages with excessive information and links going out; would they be suffering from the search engines' hammer too? How do you manage the too-many-on-page-links issue? Many thanks for your input!
Intermediate & Advanced SEO | Hermski
-
Why are so many pages indexed?
We recently launched a new website, and it doesn't consist of that many pages. Yet when you do a "site:" search on Google, it shows 1,950 results. Obviously we don't want this to be happening, and I have a feeling it's affecting our rankings. Is this just a straightforward robots.txt problem? We addressed that a while ago and the number of results isn't going down; it's very possible that we still have it implemented incorrectly. What are we doing wrong, and how do we start getting pages un-indexed?
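Worth noting, as one possible explanation: a robots.txt Disallow stops Googlebot from crawling a URL, but it does not remove URLs that are already indexed, and it also prevents Googlebot from ever seeing a noindex tag on those pages. The usual route to getting pages un-indexed is to leave them crawlable and add this to each one (a generic snippet, not specific to this site):

<!-- In the head of each page that should drop out of the index -->
<meta name="robots" content="noindex" />

Once the pages have dropped out of the index, a robots.txt block can be reinstated if crawling them is a concern.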
Intermediate & Advanced SEO | MichaelWeisbaum