Truth? "Link building isn't considered a suitable way of promotion as per recent search engine updates"
-
I need SEO help. An SEO consultant told me:
"link building isn't considered a suitable way of promotion as per recent search engine updates"
They also mention:
"Therefore we would be undertaking a range of promotional exercises such as blog postings, social book marking, press release, etc that are more effective for ensuring best possible rankings for the website."
Do you agree?
Thank you
-
This is a strange thing for an SEO company to say, because it's still widely accepted that good links pass PageRank/authority and increase the likelihood that a site will rank well for its chosen keywords. "Link building" itself might not be a term that fills Google with joy, but it's still a valuable way to improve a site's rankings, and Google knows it. If you want to stay white-hat, the truth lies somewhere in the middle: you need links, but you should acquire them in a way that doesn't involve buying them or otherwise manipulating rankings (again, if you want to stick to the letter of Google's law).
The activities they mention are fairly standard "link building" tactics, though, and "social bookmarking" in particular is a type of link building that lost popularity nearly 10 years ago because it stopped being effective. I would have doubts about their SEO chops if they cite link building as out of bounds while recommending social bookmarking as a good tactic!
-
I'd want a few more details. Most blog posts and press releases have links in them, and social bookmarking is all about links. Google has also taken action against press releases published purely for links, as well as blog posts written just for links.
-
I think it depends. Having solid content is obviously the best way to gain a natural following and thus increase your rankings. It has been suggested recently that small businesses should not chase unnatural web links and should focus instead on creating good content; the reasoning is to eliminate the black-hat link techniques of the past once and for all. Still, I don't buy the idea that good, solid links pointing to good, solid content won't help you. The future of search will probably reward a natural mix of all of the above.
To answer your question, I agree and disagree. Good links aren't going to hurt you. However, it's more important to develop content that earns those links in a natural way.