My competitors are using black hat techniques. What should I do?
-
My competitors are using on-page black hat methods, such as keyword stuffing. What should I do?
-
-
MP3 downloads aren't competitive? I would have thought that niche was quite competitive, with fairly cutthroat competition.
-
Concentrate on your on-page SEO and try to get user-generated content (UGC) on your website, since it should match well with the industry you are in. In a few months you will see a difference!
-
In your opinion, what should I do?
-
When you are in such a low-volume or non-competitive industry, it is important to know that Google may allow a low-quality site, even one using grey hat SEO, to rank simply because it happens to be better than the rest.
-
Thanks for your reply. How can I contact you, since I can't post URLs here? (I'm new to Moz.)
-
Yes, I am in a non-competitive industry (MP3 downloads).
-
I reported them, but nothing has happened (I reported them a week ago).
-
Having gone down that road myself: never mess with black hat. Keep it nice and clean, work on your website, improve your on-page optimization, report them to be on the safe side, and wait...
The worst thing you could do is fight fire with fire. Let them burn and stay clear. The only issue is if you are in a non-competitive industry. Is that the case?
-
My guess is there are probably other factors causing them to rank over you. Keyword stuffing in divs is so 2001...
You can report them as RangeMaketing suggested, sure. I would also review other on-page areas to see if there are ways you can improve on the basics, and check whether there are any links passing link equity that you can acquire to match their profile. If your page is better across the board for SEO than theirs and you have more PA/DA, etc., you may want to consider getting some co-citations so you appear as an equal to them, on top of your better on-page metrics.
Hope this helps.
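As a rough illustration of how one might sanity-check a page for the kind of keyword stuffing described above, here is a minimal density calculator. The sample text and the "stuffed" threshold are hypothetical; real on-page audits weigh far more signals than raw density.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Rough keyword density: occurrences of the (possibly multi-word)
    keyword per 100 words of text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw_tokens = keyword.lower().split()
    n = len(kw_tokens)
    # Count every position where the full keyword phrase appears.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == kw_tokens
    )
    return 100.0 * hits / len(words)

page_text = "mp3 download free mp3 download best mp3 download site for mp3 download"
print(round(keyword_density(page_text, "mp3 download"), 1))  # → 33.3
```

A density this high on real body copy would look spammy to most reviewers; normal prose targeting a phrase tends to land in the low single digits.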
-
The best you can do is report the website for 'cloaking' via Google Webmaster Tools.
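To make the 'cloaking' idea concrete: cloaking means serving crawlers different content than human visitors. A crude self-check, sketched below under the assumption that you have already fetched the page twice (once with a Googlebot User-Agent, once with a browser User-Agent), is to compare the visible text of the two responses. The threshold and tag-stripping here are illustrative only; a real check would use a proper HTML parser.

```python
import difflib
import re

def strip_tags(html: str) -> str:
    """Very rough visible-text extraction (a real tool would parse the HTML)."""
    return re.sub(r"<[^>]+>", " ", html)

def looks_cloaked(html_for_bot: str, html_for_user: str,
                  threshold: float = 0.8) -> bool:
    """Flag the page if the text served to the crawler differs substantially
    from the text served to a normal browser."""
    bot_text = " ".join(strip_tags(html_for_bot).split())
    user_text = " ".join(strip_tags(html_for_user).split())
    similarity = difflib.SequenceMatcher(None, bot_text, user_text).ratio()
    return similarity < threshold
```

For example, a page that sends a keyword-stuffed body to crawlers but a clean storefront to browsers would score a low similarity ratio and be flagged, while identical responses would pass.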
-
Yes, it is working for them, but I don't want Google to penalize my site down the road.
-
Is it working for them? If so, see if it works for you.
There is no white hat/black hat. There are only differing levels of risk tolerance. And risky SEO does pay off when done correctly.