Related Questions
-
Leveraging A Second Site
Hi, A client of mine has an opportunity to buy/control another site in the same niche. The client's site (site 1) is the top-ranked site for the niche. The second site is also often in the top half of page one. The second site has a 15-year-old design that is a really bad, almost non-functional user experience, plus thin content. Site 1 has the best link profile and dominates organic search, but the second site's link profile is as good as our nearest competitor's. Both sites have been around forever, both operate in the affiliate marketing space, and the client's site is a multi-million-dollar enterprise. If the objective were to wring the most ROI out of the second site, would you: A) Make the second site little more than a link slave to the first, going to the trouble of keeping everything separate (owner, hosting, Google Analytics, log-in IPs, etc.) so as not to devalue the links to site 1? Or... B) Develop the second site and not worry about hiding that both have the same owner? Or... C) Develop the second site and still keep the common ownership hidden from Google? Or... D) Buy the second site and forward the whole thing to site 1? I know the white hat answer is "B," but I would like to hear considerations for these options and any others. Thanks! P.S. My pet peeve is folks who slam a fast/insufficient answer onto an unanswered question just to be first. So, please don't.
White Hat / Black Hat SEO | | 945010 -
Are blogs published both on blog platforms and on our own site considered duplicate content?
Hi, SEO wizards! My company has a company blog on Medium (https://blog.scratchmm.com/). Recently, we decided to move it to our own site to drive more traffic to our domain (https://scratchmm.com/blog/), so we re-published all the Medium posts to our own website. If we keep the Medium blog posts, will this be considered duplicate content, and will our website rankings be affected in any way? Thank you!
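One common remedy for republished posts is a cross-domain canonical tag on the secondary copy pointing at the version you want ranked. A minimal sketch, assuming the Medium copy's markup can carry the tag (Medium reportedly sets this automatically for imported posts, but that's worth verifying) and using a hypothetical post URL:

```html
<!-- In the <head> of the Medium copy: tells search engines the on-site
     version at scratchmm.com is the authoritative one to index. -->
<link rel="canonical" href="https://scratchmm.com/blog/example-post/" />
```

With the canonical in place, duplicate copies are typically consolidated rather than penalized.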
White Hat / Black Hat SEO | | Scratch_MM0 -
Duplicate content warning: Same page but different URLs?
Hi guys, a friend of mine has a site, and when I tested it with Moz I noticed 80 duplicate content warnings. For instance, page 1 is http://yourdigitalfile.com/signing-documents.html and the warning page is http://www.yourdigitalfile.com/signing-documents.html. Another example: page 1 is http://www.yourdigitalfile.com/ and the second page is http://yourdigitalfile.com. I noticed that nearly every page on the site has another version at a different URL. Any ideas why the dev would do this? Also, the pages that received the warnings are not redirected to the newer pages, so you can go to either one. Thanks very much.
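The usual fix for www/non-www duplicates like these is a single 301 redirect to one canonical hostname. A minimal .htaccess sketch, assuming Apache with mod_rewrite and using the hostname from the question (picking www as canonical is an arbitrary choice; the reverse works the same way):

```apache
# Redirect all non-www requests to the www hostname with a single 301,
# so each page has exactly one indexable URL.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdigitalfile\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdigitalfile.com/$1 [L,R=301]
</IfModule>
```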
White Hat / Black Hat SEO | | ydf0 -
On Page #2 of Bing But Nowhere on Google. Please Help!
Hi, community. I have a problem with the ranking of my blog and I hope someone can help me solve it. I have been trying to rank my blog post for a keyword for almost 6 months but still have no success. My URL is: this blog post. Target keyword: best laptops for college.
The interesting fact is that the post has been on page #2 of Bing but is nowhere on Google. It was on page #3 of Google for about a month, but for the last 1-2 weeks it has been gone (not ranked anymore, though still indexed). The post was replaced by another post from my blog (let's say post A) which doesn't have any links. Post A is ranking on page #4 right now.
The weird thing is that which of my posts ranks for this keyword frequently changes. One day post A was on page #4, then after a few days it changed to post B. Yesterday I searched Google for "number one on bing but nowhere on google", came across an article in the Moz community, and one of the people there said it was an over-optimization issue. I think my post is suffering from an algorithmic over-optimization penalty.
Just for your information, I have been building backlinks to this URL for the last 5 months (the post is over a year old). It has only about 1.5k backlinks from 200 domains (according to Ahrefs). I have kept exact-match anchors to around 2%; the rest are branded, naked-URL, and generic anchors. So, in this case, I don't think I have over-optimized my anchor text. I have also checked the keyword density and found it was "safe".
One important thing I remember from just before the post disappeared: I added a backlink from lifehack.org (a guest post) with an exact-match anchor. I suspect this is the real cause, because 2-3 days after doing that the post dropped and was replaced by another post from my blog (as I mentioned before). But it's very strange, because the number of keyword anchors (including long-tail variants) is only about 10 out of 200 domains, or roughly 5%, which should be safe.
I'm so sorry, it's a long story 🙂 So, what is actually happening to my post, and how do I fix this problem? Please, please help me. Any help is appreciated. By the way, sorry for my poor English 🙂
White Hat / Black Hat SEO | | Airsionquin0 -
Best practice to preserve the link juice to internal pages from an expired domain?
This question relates to setting up an expired domain that already has quality links, including deep links to internal pages. Since the new site structure will be different, what's the best practice for preserving the link juice to these internal pages? Export all the internal pages that were linked to (using Majestic SEO, Ahrefs, etc.) and set up those previously linked-to pages on the new site? Or 301 redirect those pages to the home page? I've heard there's a WordPress plugin that 301 redirects all 404 errors, supposedly preserving all the potential link juice.
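If page-to-page mapping is chosen over a blanket redirect to the home page, it can be expressed as individual 301 rules. A minimal .htaccess sketch, assuming Apache with mod_alias; the paths here are hypothetical placeholders for the old linked URLs and their new equivalents:

```apache
# Map each old URL that has inbound links to its closest new equivalent,
# one rule per page, so each deep link keeps a relevant target.
Redirect 301 /old-guide.html /resources/new-guide/
Redirect 301 /old-category/ /new-category/
# Anything left unmapped could fall back to the home page, at the cost
# of treating every remaining old deep link identically.
```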
White Hat / Black Hat SEO | | adorninvitations0 -
Forcing Entire site to HTTPS
We have a WordPress site and want to force everything to HTTPS. We changed the site name (in WordPress settings) to https://mydomain.com. In the .htaccess code from http://moz.com/blog/htaccess-file-snippets-for-seos ("Ensure we are using the HTTPS version of the site"):
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
but some blogs (http://stackoverflow.com/questions/19168489/https-force-redirect-not-working-in-wordpress) say:
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
Which one is right? 🙂 And are we missing anything?
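For what it's worth, Apache sets the %{HTTPS} variable to either "on" or "off", so the two conditions above (`!on` and `off`) match the same requests and are functionally equivalent. A minimal combined sketch, assuming Apache with mod_rewrite enabled and placed before the standard WordPress rewrite block:

```apache
# Force HTTPS for every request with a single 301.
# %{HTTPS} is "on" or "off", so "!on" and "off" are interchangeable here.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>
```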
White Hat / Black Hat SEO | | joony0 -
Publishing Press Releases after Google Panda 2.5
For the past few years I have been publishing press releases on my site for a number of businesses. I have high traffic on my site. I noticed that with the Google Panda 2.5 update, PRNewswire.com dropped in visibility by 83%. Should I stay away from publishing press releases now? Does Google consider press releases to be "content scraping," since multiple sources publish the same release?
White Hat / Black Hat SEO | | BeTheBoss2 -
Opinions Wanted: Links Can Get Your Site Penalized?
I'm sure by now a lot of you have had a chance to read Let's Kill the "Bad Inbound Links Can Get Your Site Penalized" Myth over at SearchEngineJournal. When I initially read this article, I was happy: it was confirming something I believed and supporting a stance that SEOmoz has taken time and time again. The idea that bad links can only hurt via loss of link juice when they get devalued, not through any sort of penalization, appears in many articles across SEOmoz. Then I perused the comments section, and I was shocked and unsettled to see some industry names I recognized taking the opposite side of the issue. There seem to be a few different opinions: The SEOmoz opinion that bad links can't hurt except when they get devalued. The idea that you wouldn't be penalized algorithmically, but a manual penalty is within the realm of possibility. The idea that both manual and algorithmic penalties are a factor. Now, I know that SEOmoz preaches a link building strategy that targets high-quality backlinks, so if you completely subscribe to the Moz method, you've got nothing to worry about. I don't want to hear those answers here; they're right, but they're missing the point. It would still be prudent to have a correct stance on this issue, and I'm wondering if we have that. What do you guys think? Does anybody have an opinion one way or the other? Does anyone have evidence of it being one way or another? Can we set up some kind of test, rank a keyword for an arbitrary term, and go to town blasting low-quality links at it as a proof of concept? I'm curious to hear your responses.
White Hat / Black Hat SEO | | AnthonyMangia0