Changes to SEO with disavow?
-
Has the game changed a lot with the disavow tool? I still see people saying "check out what your competitors are doing," but having just gone through a disavow myself, how do you actually know what the correct link diversity is, when anywhere from 0 to 100% of a site's links could be disavowed?
Also, couldn't a competitor just buy a load of spammy links and disavow them to mask their real links? (I know that in my own backlink profile only 150 links are good and the rest is disavowed crap.)
-
Would you buy a bunch of spammy links and point them at your own site just so you could disavow them and try to fool your competitors? Not likely. It's hard to fool anyone knowledgeable about links once they actually visit the linking site or page and judge its quality, and a strong competitor isn't going to risk a penalty on a technique like that.
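For context, a disavow file is just a plain-text list uploaded through Google Search Console: one URL or `domain:` entry per line, with `#` marking comments. A minimal sketch (the domains below are placeholders, not real entries):

```text
# Link-farm domains disavowed after a manual review
# (example entries; spam-example.com etc. are placeholders)
domain:spam-example.com
domain:link-farm-example.net
http://another-example.org/spammy-page.html
```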
Related Questions
-
Are trailing slashes a necessity for SEO on blogs?
Hi, I have a website, https://australiatimenow.com.au/ I would like to remove the trailing slash and move to the .html format. I have never done SEO on my articles. Will moving to the .html format cause any issues?
White Hat / Black Hat SEO | joshnajenny
Does the home page carry more SEO benefit than other pages?
Hi, I would like to include my keywords in the URL, and they are under 50 characters. Is there anything in the algorithm that tells engines to give more importance to the home page?
White Hat / Black Hat SEO | alan-shultis
Old subdomains - what to do SEO-wise?
Hello, I wanted the community's advice on how to handle old subdomains. We have https://www.yoursite.org. We also have two subdomains directly related to the main website: https://www.archive.yoursite.org and https://www.blog.yoursite.org. As these pages are not actively updated, they trigger lots and lots of errors in the site crawl (missing meta descriptions, and much more). We have no particular intention of keeping them up to date in terms of SEO. What do you think is the best way to handle them? I considered de-indexing, but the content of these pages is still relevant and may be useful; it is just no longer up to date, and it never will be again. Many thanks in advance.
White Hat / Black Hat SEO | e.wel
Wanna see Negative SEO?
One of my clients got hit with negative SEO in the past few days. Check it out in Ahrefs. The site is www.thesandiegocriminallawyer.com. Any advice on what, if anything, I should do? A Google disavow? Thanks.
White Hat / Black Hat SEO | mrodriguez1440
Is asynchronous loading of product prices bad for SEO?
We are currently looking into improving our TTFB (time to first byte) on our ecommerce site. A huge improvement would be to load the product prices on the product list pages asynchronously. The product detail page, on which the product is actually ordered, will be left untouched. The idea is that all content such as product data, images, and other static content is sent to the browser first (first byte). The product prices depend on a set of user variables such as delivery location and VAT inclusive/exclusive, so they would be requested via an AJAX call to reduce the TTFB. My question is whether Google considers this black hat SEO or not.
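The pattern described above can be sketched roughly as follows. This is a hypothetical illustration, not the site's actual code: the server renders the product list without prices, and the browser then fills them in with one batched request after first paint. The names `loadPrices` and `fetchPrices` are made up for the example.

```javascript
// Hypothetical sketch: fill in prices after the page has rendered.
// fetchPrices is assumed to send the product ids plus the user context
// (delivery location, VAT inclusive/exclusive) and resolve to an
// {id: price} map computed server-side.
async function loadPrices(productIds, fetchPrices) {
  const prices = await fetchPrices(productIds);
  // Pair every rendered product with its price, or null while unknown,
  // so the caller can write each value into its placeholder element.
  return productIds.map((id) => ({ id, price: prices[id] ?? null }));
}
```

On the page itself you would call this from a DOMContentLoaded handler and write each price into its placeholder. Crawlers still receive the full product content in the initial HTML; only the price arrives late.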
White Hat / Black Hat SEO | jef222
Press release SEO: more than 90 press releases on PRWeb. How good is this for SEO?
A competitor's site known to us has published 90 press releases for all upcoming car models with PRWeb (PR 7, DA 97) in the last 4 months, all with different anchor-text keywords and links. The page authority is also very high, around 70 in some of them. Though they might have spent a sum on this, does it really make a difference from an SEO perspective? If yes, could 10 press releases with PR Newswire, which has PR 8 and DA 95, work if we do this with all unique anchor texts and links?
White Hat / Black Hat SEO | Modi
Same content, different target area SEO
So ok, I have a gambling site that I want to target for Australia, Canada, the USA, and England separately, and still have .com for worldwide (or not; read further). The website's content basically stays the same for all of them, perhaps with just small changes to layout and information order (a different order for the top 10 gambling rooms). My question 1 would be: how should I mark the content for Google and other search engines so that it is not considered "duplicate content"? As I have mentioned, the content actually will BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far:
1. A separate Webmaster Tools account for every domain; we will need to set geo-targeting to the specific country in each one.
2. Use hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains. More info: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk is from a different C-class than the one for the .com.
Is there anything I am missing here? Question 2: should I target .com for the USA market, or are there other options? (We are not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno
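Point 2 above is usually implemented as `link` elements in the head of every version of the page (it can also be done via HTTP headers or a sitemap); each version lists all of its alternates, including itself, plus an `x-default` fallback. A sketch with placeholder domains, not the asker's real sites:

```html
<link rel="alternate" hreflang="en-GB" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-AU" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="en-CA" href="https://www.example.ca/" />
<link rel="alternate" hreflang="en-US" href="https://www.example.us/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note that hreflang annotations must be reciprocal: if the .co.uk page points at the .com.au page, the .com.au page must point back, or the annotations may be ignored.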
Vendor Descriptions for SEO... Troublesome?
Howdy! I have been tossing this idea around in my head over the weekend and I cannot decide which answer is correct, so here I am! We are a retailer of products, currently in the midst of redesigning our site, not only the design but also the content. The issue we are facing is with product descriptions from our vendors. We are able to take the product descriptions/specs from their websites and use them on ours, but my worry is that we will get flagged for duplicate content. Other retailers (as well as the vendors) are using this content too, so I don't want it to have an adverse effect on our ranking. There are so many products that it would be a large feat to rewrite unique content, not to mention that the majority of the wording would be extremely similar. What have you seen in similar situations? Is it bad to use the descriptions? Do we need to bite the bullet and do our best to rewrite hundreds of product descriptions? Or is there a way to use the descriptions and tag them so that Google won't penalize us? I originally thought that if we have enough other unique content on our site it shouldn't be as big a deal, but then I realized how much of our site's structure is our actual products. Thanks in advance!
White Hat / Black Hat SEO | jpretz