
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hello everyone, I have added this to my .htaccess: Options +FollowSymlinks
    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^domain.com$
    RewriteRule (.*) http://www.domain.com/$1 [R=301,L]
    RewriteCond %{THE_REQUEST} ^.*/index.html
    RewriteRule ^(.*)index.html$ http://www.domain.com/$1 [R=301,L]
    ErrorDocument 404 /index.html
    Is this correct, or does it need any change? Please help. Thanks in advance.

    | falguniinnovative
    0
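For reference, a tidied sketch of this kind of ruleset is shown below. This is an illustration under assumptions, not drop-in code: domain.com is a placeholder, and the escaped patterns are a stricter rewrite of the conditions in the question rather than the poster's exact configuration.

```apache
Options +FollowSymlinks
RewriteEngine on

# Redirect the bare host to the www host (NC = case-insensitive match).
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

# Strip a trailing index.html from any requested path.
# THE_REQUEST looks like "GET /some/path/index.html HTTP/1.1".
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.html\ HTTP/ [NC]
RewriteRule ^(.*)index\.html$ http://www.domain.com/$1 [R=301,L]

# Serving index.html as the 404 document (as in the question) is unusual;
# a dedicated error page is the more common pattern.
ErrorDocument 404 /404.html
```

Note that unescaped dots in patterns such as `^domain.com$` still match, since `.` matches any character; escaping them is simply stricter.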

  • I'm going through the crawl report and it says I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2/. I can't even find author/admin/page/2 in WordPress, but it is the same thing as blog/page/2 nonetheless. Is this something I should just ignore, or should I create author/admin/page/2 and then 301 redirect it to blog/page/2?

    | shift-inc
    0

  • Moz is picking up a large quantity of duplicate content, consisting mainly of URL parameters like ,pricehigh and ,pricelow (used for page sorting). Google has indexed a large number of the pages (not sure how many), and I'm not sure how many of them are ranking for search terms we need. I have added the parameters in Google Webmaster Tools and set them to 'Let Google decide', but Google still sees the pages as duplicate content. Is this a problem that we need to address? Or could trying to fix it do more harm than good? Has anyone had any experience with this? Thanks

    | seoman10
    0
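One common complement to the parameter settings is a canonical link on each sorted variant pointing back at the unsorted page. The sketch below is an illustration only; example.com and the category path are hypothetical, not the poster's actual markup:

```html
<!-- In the <head> of /category,pricehigh and /category,pricelow alike,
     point search engines at the canonical, unsorted category page. -->
<link rel="canonical" href="https://example.com/category">
```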

  • Hello Moz, We had a massive SEO drop in June for unknown reasons and we have been trying to recover since then. I just noticed this yesterday and I'm worried. See: http://imgur.com/xv2QgCQ Could anyone help by explaining what would cause this sudden drop and what this drop translates to exactly? What is strange is that our index status is still strong at 310 pages, no drop there: http://imgur.com/a1sRAKo And when I do a site:globecar.com search on Google everything seems normal, see: http://imgur.com/O7vPkqu Thanks,

    | GlobeCar
    0

  • Our developer has improved page speed, particularly for mobile. However, the price for this improvement has been an HTML validation error that cannot be removed without compromising page load speed. Is the improvement in speed worth living with the validation error? The concern is paying a high SEO price for this "fatal error". Or perhaps this error is in fact not serious? Fatal Error: Cannot recover after last error. Any further errors will be ignored. From line 699, column 9; to line 699, column 319. OUR DEVELOPER'S COMMENT: "This was made following Google PageSpeed Insights recommendations. If we remove that, we will lose page load performance." The domain URL is www.nyc-officespace-leader.com

    | Kingalan1
    0

  • I recently had an SEO consultant recommend using tags instead of h1/h2 tags for article titles on the homepage of my news website and on category landing pages. I've only seen this done a handful of times on news/editorial websites. For example: http://www.muscleandfitness.com/ Can anyone weigh in on this?

    | blankslatedumbo
    0

  • Hi there, Some external developers have created a wishlist for a website that allows visitors to add products to a wishlist and then send an enquiry. It is a very similar set-up to a shopping basket (without the payment option). However, this wishlist lives in a separate iframe and refreshes every 30 seconds, via a meta refresh, to reflect any items visitors add to their wishlist. I'm aware of the obvious usability issue that a visitor's product only appears in their wishlist after up to 30 seconds. However, are there also any SEO issues due to the iframe refreshing every 30 seconds? Please let me know about any issues, whether small or large.

    | Robbern
    0

  • Is it a good move to pick 10 free blogging sites to build links, drip-feeding them? Say 10 blogging sites, irrespective of whether each is a subdomain (as with WordPress) or a subfolder blog (as with LiveJournal). I would add articles related to my money website on those newly created blogs and build links from them, then drip-feed them by posting one article a month at regular intervals, with anchor-text links in each. Do you think it's a good move?

    | welcomecure
    0

  • Hello everyone, Can anybody suggest good software, or any other way, to easily submit my website to all search engines? Any expert help would be appreciated. Thanks in advance

    | falguniinnovative
    0

  • I did some searching before asking but could not quite find what I was looking for. There are valid directories out there that provide business as well as links that provide SEO value. My question is whether or not having a redirect in place negates passing any link juice. When I use Open Site Explorer for Old Monterey Inn, this directory (CABBI) does not show up on their list. However, their website dropped from Google Analytics altogether for some time because of an issue in how they built their site. Their "fix" is this redirect, which was integrated a short time ago. I do see traffic in Google Analytics now but wonder about the link juice. Example: <a href="https://www.cabbi.com/redirect?type=website&inn=34211&url=http%3A%2F%2Fwww.oldmontereyinn.com" target="_blank">www.oldmontereyinn.com</a> What say you? Thanks to anyone who responds.

    | ColoradoMarketingTeam
    0

  • We use link shortener services like Bitly, Goo.gl, etc. Does a post made using such link shortener services count as a social signal, or should we post the complete website URL pointing to each page when posting on social sites? Secondly, should we write a new description when posting on social sites, or just copy and paste a few lines of the original posts?

    | welcomecure
    0

  • Hello all, In 2013 I had an Unnatural Links Warning message in my GWT account. I believe that it was a result of the work of an old SEO company. When I received the warning I was working with an SEO. He helped me clean up some links. He also uploaded a disavow file for me. He did not file a reconsideration request. He told me that it was not necessary at the time. The message disappeared from my account. A few months ago a similar message appeared in the manual accounts section of my account. I gathered inbound links from GWT, Majestic, etc. I went through them myself and tried to contact lots and lots of webmasters. I got many links cleaned up. I spent several months on this project. I just logged into my Search Console account this afternoon and clicked through everything and guess what... that manual penalty message is gone. So... what does that mean? I assume that I should still upload the disavow file for the sites that did not respond to me that are spammy. Should I still try to file a reconsideration request even though there doesn't seem to be a manual penalty? How should I proceed? Thanks. Melissa

    | pajamalady
    0

  • Recently a module was built into the homepage to pull in content from an outside source via AJAX, and I'm curious about the overall crawlability of the content. In WMT, if I fetch & render, the content displays correctly, but if I view source all I see is the empty container. Should I take additional steps so that the actual AJAX content displays in my source code, or am I "good" since the content does display correctly when I fetch & render?

    | RosemarieReed
    0

  • We've been suffering under a Panda penalty since Oct 2014. We've completely revamped the site, but with this new "slow roll out" nonsense it's incredibly hard to know at what point you have to accept that you haven't done enough yet. We have thousands of news stories going back to 2001, some of which are probably thin and some of which are probably close to other news stories on the internet, being articles based on press releases. I'm considering noindexing everything older than a year just in case; however, that seems like overkill. The question is: if I mine the log files and only deindex stuff that Google sends no further traffic to after a year, could this be seen as trying to game the algorithm or similar? Also, if the articles are noindexed but still exist, is that enough to escape a Panda penalty, or does the page need to be physically gone?

    | AlfredPennyworth
    0

  • Hi, I recently started working with two products (one is community-driven content, the other is editorial content), and I've seen a strange pattern in both of them. The Google index has constantly decreased week over week for at least a year. Yes, the decrease increased 🙂 when the new mobile version of Google came out, but it was still declining before that. Has it ever happened to you? How did you find out what was wrong? How did you solve it? What I want to do is take the sitemap and look for the URLs in the index, to first determine which URLs are missing. The problem, though, is that the sitemap is huge (6M pages). Have you found a solution for dealing with such big index changes? Cheers, Andrei

    | andreib
    0

  • Hi all, we are looking for some guidance, please, if at all possible. We have a .com domain (older than 10 years) that we have been using for 2 years. We also have the .com.au version of the domain (2 years old, pointing to the .com domain), which isn't being used. We are an Australian-based company. Our question is: should we be using .com.au instead of .com, and if so, how would you advise going about the changeover without a huge negative SEO impact on our business? We are on the home page for most of the searches we have optimized for, but we are always below the .com.au's, which is why we are considering the possibility of the move. Any advice would be GREATLY appreciated 🙂

    | creativeground
    0

  • We have a mobile website (mobile.website.com) that mirrors our desktop site (www.website.com), with 100,000+ pages. We have an alternate tag on our desktop site pointing to our mobile site, and user-agent detection that redirects mobile traffic to our mobile site. Our mobile site is noindexed and has a canonical to our desktop site. Everything works pretty well: the mobile website is not indexed and only shows up in SERPs when a user searches from a mobile device. Our main website is now responsive and we would like to kill our mobile site without compromising our traffic. We know that a slight speed or content change can affect our traffic, so what would be the best way to do this? Big bang: redirect all mobile URLs to desktop, remove the user-agent detection and remove the alternate tag on desktop. Semi big bang: remove the user-agent detection and the alternate tag on desktop and see how the traffic reacts before redirecting. Progressive: remove the user-agent detection and the alternate tag on some sections of the website to see how the traffic reacts. Other? Anyone have any experience with this? Thanks, and let me know if anything is not clear.

    | Digitics
    0

  • I ask because I see many of my competitors with irrelevant backlinks still ranking at the top, despite what SEOs say about it. I have a web design business; most of my competitors get their backlinks from sites they built, i.e. "website by _____", and that's how they rank at the top. Should I keep getting backlinks from my clients (with permission)? Also, does anchor text on backlinks affect all keywords on your website? For example, say you provide multiple services, e.g. plastering and decorating, and you have 100 links pointing to your site with anchor text of "plastering". Technically your site still has 100 backlinks; will that also help boost optimisation for the keyword "decorating"?

    | Marvellous
    0

  • Is it good to link to external websites from every page? The on-page grader shows there should be one link pointing to an external source. I have a website that could point to an external website from every page using the brand name of the specific site, like deal sites do. Is it worth having an external link on every page, of course with a nofollow tag?

    | welcomecure
    0

  • I've got a bit of an odd situation... My business partner and I split up, and he's going to keep the company name. The website that I built for the company has some links to it, and I've managed to build up some DA and PA. I want to get the link juice over to my new website. My former partner doesn't care about the link juice, he just wants a website that he can show people. SO, I can't do a 301 or 302, because that would take down the existing site. Can I just use a canonical tag that refers link power to my new website? Would this be harmful in any way? What should I do to accomplish getting the link power without a redirect, and without contacting each person who has given us a backlink?

    | Zing-Marketing
    0

  • Hi guys, We currently have duplicate pages across a website, e.g. https://archierose.com.au/shop/cart#! and https://archierose.com.au/shop/cart. The only difference is that the first URL ends in a hash and exclamation mark (#!); everything else is the same. We were thinking of adding rel canonical tags on the #! versions of the page pointing to the correct URLs. But Google doesn't seem to be indexing the #! versions anyway. Does anyone know why this is the case? And if Google is not indexing them, is there any point adding rel canonical tags? Cheers, Chris

    | jayoliverwright
    0

  • I cannot tell if something is wrong with my domain redirect or if it is default behavior in Chrome. I made a short video; I used Moz first since they are using an SSL certificate too. The gist is that immediately when I type in the address it flashes www, then redirects to the non-www domain. Because it happens so quickly, I cannot tell if this is just default Chrome behavior or if I have the DNS set up incorrectly. Here is the video: http://screencast.com/t/UDHmdTCbQv

    | LesleyPaone
    0

  • We have a long-established business, founded in 2004, and having been one of the original companies in our industry, we have always enjoyed strong Google rankings. Unfortunately, these have been steadily declining over the past couple of years, and a comparison of August to date against the equivalent period last year shows a 20% drop in traffic from Google. We don't believe it is being caused by a penalty; rather, it is the result of some strong players entering our market and tightening their focus, which has caused us to take a dip in rankings. We are guilty of being complacent in our SEO (largely due to not knowing what to do, and being scared to touch it while it was working in case we broke it!) but now it's time to fight back. We still have a strong site, good traffic levels and a strong product offering. We have knowledge of SEO and resources in house, but are not experts by any means. Our current plan is to:
    perform a technical site audit, fixing the issues highlighted by the Moz Pro software
    put strong emphasis on our blog, writing daily about the latest news and events in our industry
    provide weekly content articles which are more in-depth than the daily blog articles and which will be of interest to our community
    undertake surveys and publish infographics and statistics, with the hope of being picked up by national newspapers
    Are there any key elements missing from this plan, or is that it in a nutshell? Any help and advice is greatly appreciated.

    | simonukss
    0

  • Google has come a long way over the past 5 years; the quality updates have really helped bring top-quality content that is relevant for users' search terms to the top. But there is one really ANNOYING thing that still has not been fixed: websites using a brand name as a service search term to manipulate Google. I have a real example, but I wouldn't like to use it in case the brand mention flags up in their tools and they spot this post. Take a search like "Service+Location": you will get 'service+location.com' ranking #1. Why? Heaven knows. They have fewer than 100 backlinks, all of very low, spammy quality from directories. The content is poor compared to the competition, while the competitors have amazing link profiles, great social engagement and much better website user experience; the data does not support the ranking at all. All the competitors are targeting the same search term, yet the worst site is ranking the highest. Why on earth is Google not fixing this issue? The page we see ranking #1 does not even deserve to rank in the first 5 pages.

    | Jseddon92
    0

  • I was wondering: when you see updated results for a sporting event in Google Search, are those the result of structured data?

    | mycujoo
    0

  • Hi, When redirecting an entire site to another domain, do you have to maintain the SSL certificate? The SSL expires 3 days before the planned redirect. Thanks in advance.

    | sofla_seo
    0

  • Hi, Moz fans 🙂 I've been doing SEO for about a year now and have a new site that I do not know how to improve any further. The main keyword is "Webdesign Freiburg" and the site is werkzeug    -    kasten    .   com Would anyone like to take a look and tell me what might bring us from page 2 to page 1 on Google? Thanks a lot, Marc

    | RWW
    0

  • I have a site that is a marketplace. We don't own any items; the sellers fill everything out and then it goes up on the site. Many of our sellers also have their own sites and just send us a spreadsheet with all of their items, which we bulk upload. In that case, what we put up is very similar to what they already have on their own site. I used the Fruition penalty checker and it seems to suggest that we got hit with some penalties for Panda and quality content. With the Google algorithm it is hard to know for sure what we got hit with. Is it possible Google sees us as one of those crappy scraper sites? Is there anything we can do? We never see the items, so I can't add to people's descriptions.

    | EcommerceSite
    0

  • My site has several different homepage versions. I am running on the Volusion eCommerce platform. www.mydomain.com   -   Page Authority 44
    www.mydomain.com/Default.asp   -   Page Authority 33
    www.mydomain.com/default.asp   -   Page Authority 33 So here is the question: is it normal to have different Page Authorities for each version? Is this diluting my SEO for the homepage? Any input on this would be appreciated.

    | PartyStore
    0
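For illustration only: on an Apache server, consolidating the /Default.asp variants into the root URL could be sketched as below. Volusion is a hosted, IIS-based platform, so in practice this would be done through the platform's own redirect or canonical settings rather than an .htaccess file; mydomain.com is a placeholder.

```apache
RewriteEngine on

# 301 any-case /default.asp at the document root to the bare homepage URL,
# consolidating the three homepage variants into one.
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /default\.asp [NC]
RewriteRule ^default\.asp$ http://www.mydomain.com/ [R=301,L,NC]
```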

  • A website keeps showing more articles when pressing a "load more" button. This loads additional category pages with a page parameter (e.g. ...?page=1, ...?page=2, etc.), as suggested by Google to get all pages indexed. The problem is that this creates thousands of additional, duplicate pages, with a duplicate title, header, and very unfocused content. They also show as duplicate content in Moz. The pages are indexed by Google, but none of them is ranking. What do you guys think: add a no-follow to the load-more button, so search engines will never see them? Thanks for your input!

    | corusent
    1

  • I understand a 301 redirect carries over the page value to the page it's being redirected to. However, what happens if, for example, I do a 301 redirect from example.com to example.co.uk, then 2 months later take down hosting and cancel the domain for example.com? Would I lose the page value that was being carried over to example.co.uk? Do I need to keep both domains active?

    | Marvellous
    0

  • Hi, Hope someone can offer some insight here. We have a site with an active forum. The transactional side of the site is about 300 pages total, and the forum is well over 100,000 pages (and growing daily), meaning the 'important' pages account for less than 0.5% of all pages on the site. Rankings are pretty good and we're ticking lots of boxes with the main site, with good natural links, logical architecture and appropriate keyword targeting. I'm worried about the following: crawl budget, PR flow, Panda. We actively moderate the forum for spam and generally the content is good (for a forum, anyway), so I'm just looking for any best-practice tips for minimising risk. I've contemplated moving the forum to a subdomain so there's that separation, or even noindexing the forum completely, although it does pull in traffic. Has anyone been in a similar situation? Thanks!

    | iProspect_Manchester
    1

  • Hi, Since I love writing and I write a lot, I always find myself worried about ruining things for myself with content cannibalism. Yesterday, while looking to learn about diamonds, I encountered a highly ranked website that has two pages ranking high on the first page simultaneously (4th and 5th); I'd never noticed this before with Google. The term I googled was "vvs diamonds" and the two pages were: http://bit.ly/1N51HpQ and http://bit.ly/1JefWYS Two questions: 1. Does that happen often with Google (presenting two results from the same site on the first page)? 2. Would it be better practice for the writer to combine them, creating one more powerful page? Thanks

    | BeytzNet
    1

  • Our developer has suggested that we alter our HTML so the important content appears at the very top of the source code and Google can index our pages more efficiently. Is this a worthwhile improvement in terms of ranking? Our developer describes the improvement as follows: by changing the sort order of the important content inside the code, we may end up with a similar text-to-code ratio, but the important content we need Google to index will be at the very top of the source code. From a technical standpoint, Google will find the key content faster, and that should help the crawling process, as search engines read HTML code linearly. This change will not necessarily affect the HTML; we can achieve it with the style sheet (CSS) instead, reducing the chance of major bugs. Thanks, Alan

    | Kingalan1
    0

  • Hi, I've just sold the domain for a website, so I'm free to re-purpose the content on another website I own. How can I make sure that Google doesn't deem it duplicate? Do I need to let Google naturally realise that the 'original' website no longer has the content on it? Do I need to hold off putting the content live again? Should I notify Google by way of a de-index request (assuming the domain won't incur any difficulty if I do this)? Thanks in advance.

    | newstd100
    0

  • I have seen many instructions but I am still uncertain. Here is the situation: we will be implementing rel="prev"/rel="next" on our paginated pages. The questions are:
    Do we implement a self-referencing canonical URL on the main page and each paginated page?
    Do we implement a noindex/follow meta robots tag on each paginated page?
    Do we include the canonical URL for each paginated page in the sitemap if we do not add the meta robots tag?
    We have a view-all page but will not be using it due to page load issues; what do we do with the view-all URL? Do we add meta robots to it?
    For website search results pages containing pagination, should we just put a noindex/follow meta robots tag on them?
    We have separate mobile URLs that also contain pagination. Do we need to treat these as a separate pagination project? We already canonical all the mobile URLs to the main page of the desktop URL. Thanks!

    | seo32
    0
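For reference, the standard head markup for a middle page of a paginated series, combining a self-referencing canonical with rel prev/next, looks roughly like this. It is a sketch only; example.com and the page parameter are placeholders, not the poster's URLs:

```html
<!-- In the <head> of https://example.com/category?page=2 -->
<link rel="canonical" href="https://example.com/category?page=2">
<link rel="prev" href="https://example.com/category?page=1">
<link rel="next" href="https://example.com/category?page=3">
```

The first and last pages of the series simply omit rel="prev" and rel="next" respectively.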

  • We are seeing the following trend in our rankings and traffic after the recent Google algorithm updates (May 2015 quality/phantom, and July 2015 Panda), and I am curious if anyone here has encountered similar and/or has any good ideas on how to react. Background - we operate in a niche segment, but compete for keywords with large home improvement stores and mass retailers. In the past, prior to May 2015, we generally ranked higher than the large home improvement stores and mass retailers for our key specific terms in our niche. We believed that it was because we have a very specialized focus and so our store was highly relevant for someone searching in that niche (for example for the name of the product category as a keyword). In general, we ranked #1-3. Along with a few of our competitors in our niche. And then would be the big box home improvement stores in spots 5-10. The change we saw starting in May is that now all the home improvement stores and also a few large multi-category retailers took over those top 5 spots and bumped all the specialty retailers and the specialty brand manufacturers (like us) down. Our direct competitors in our niche all seem to have been impacted pretty much the same as us. So, in summary it seems like these latest updates may have favored the more general retailers but with stronger domain authority than the more specific but smaller retailers. Hard to know for sure, but this is the trend we believe we see. So, that said, what are some good strategies to respond to this situation? We can't really compete on overall domain authority with these huge retailers. And our previously successful strategy of having a very focused niche, with lots of helpful content, videos, instructional guides, etc. no longer seems to be enough. Has anyone else seen similar results since this past May? Where specialty retail or brand sites lost ground to larger general retailers? 
And if so, has anyone found any good strategies to gain back their previous rankings, or at least partially?

    | drewk
    1

  • For example, chicago-company.com has the same content as springfield-company.com and I am searching for a general non-brand term (i.e. utility bill pay) and am located in Chicago. How can we optimize the chicago-company.com to ensure that chicago's site results are in top positions over springfields site?

    | aelite
    1

  • We are in the process of using SVG to create an interactive map that will help site visitors browse product listings by region. (I work for a real estate company) For example, if a visitor were to click on the Florida portion of this US map, they would be taken to our Florida product page via a link setup that looks like: <a  xlink:href="/luxury-homes/Florida">... </a> Can anyone help me get a better understanding of how these xlinks are handled for SEO purposes? Are they indexed like regular links by search engines? do they pass link juice? Thanks,

    | AaronPC
    0

  • I'm helping someone with a new site that will have pages for organic search that contain embedded video. Some will be YouTube embeds and some will be Wistia embeds. These pages will have several hundred words of transcript text, and the embeds (iframes) themselves will be rather small, but expandable, with possibly more than one per page. The transcript text area is more like 80% of the page. Do you think this is an organic search problem? I use one site audit tool that calls this out as a serious warning. Currently, the embedded player(s) are in a column down the left side, about 1/4 of the width of the page, and the transcript is everything else, wrapping around it. The transcripts are fully readable and not hidden in some kind of expandable accordion or anything. Does layout matter for this issue? Thanks... Darcy

    | 94501
    0

  • We have an e-commerce site and we have started to implement schemas. I've looked around quite a bit but could not find any schemas for product categories. Would there be any schemas to add besides an image, description and occasional PDF?

    | Mike.Bean
    0
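There is no dedicated product-category type at schema.org, but category pages are often marked up with ItemList to describe the products listed. A hypothetical JSON-LD sketch (the URLs are placeholders, not the poster's site):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "url": "https://example.com/category/widget-a" },
    { "@type": "ListItem", "position": 2, "url": "https://example.com/category/widget-b" }
  ]
}
</script>
```

Each product's own page would then carry the full Product markup (name, image, description, offers).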

  • Hi Mozers, Can anyone tell me how to set rel="canonical" tags via Yoast SEO? I have an article posted on my blog that was published first on another blog, and I need to reference that entry somehow. I have been told to use rel="canonical"; if so, I would appreciate some insight on how to do this exactly. Thanks very much in advance

    | edward-may
    0

  • Is it possible to find out where traffic is coming from on someone else's website? I want to know where the new buyers who are interested in outsourcing are coming from. Below are some of the pages they would be looking at. Who is visiting these pages, and where are they coming from? https://www.upwork.com/blog/ https://www.upwork.com/hiring/ https://www.upwork.com/i/howitworks/client/ https://www.upwork.com/signup/create-account/client_direct https://www.upwork.com/o/profiles/browse/ https://www.upwork.com/press/ https://www.freelancer.com/ https://www.freelancer.com/about https://www.freelancer.com/info/how-it-works.php https://www.freelancer.com/showcase https://www.freelancer.com/community https://www.freelancer.com/hire/ https://www.freelancer.com/contest/ https://www.freelancer.com/feesandcharges/ https://www.freelancer.com/freelancers/ http://www.guru.com/ http://www.guru.com/howitworks.aspx http://www.guru.com/about/ http://www.guru.com/help/ http://www.guru.com/blog/ http://www.guru.com/blog/category/hiring-advice/ http://www.guru.com/d/freelancers/ http://www.guru.com/directory http://www.guru.com/answers/

    | Hall.Michael
    0

  • Our website's social media buttons for Facebook, Twitter, LinkedIn and Google+ are located in the footer of all pages. These links are set to "nofollow". Running an SEMrush audit shows these "nofollow" links coming up as an "issue". Is it best practice to set these links to social media sites to "follow" as opposed to "nofollow"? I am somewhat concerned about losing link juice, but perhaps that is an outdated point of view. Any thoughts? Thanks, Alan

    | Kingalan1
    1

  • I am merging a niche site, tshirts.com, into another site, mainsite.com. I am using an htaccess file on a Linux server, and the homepage of the niche site is being redirected to the corresponding category page on the main site (i.e. nichesite.com to mainsite.com/niche.html). Everything else is also a page-to-page redirect. I have something like this in the htaccess file: Redirect 301 http://tshirts.com/ http://www.mainsite.com/tshirts.html
    Redirect 301 http://tshirts.com/blue.html  http://www.lampclick.com/blue-t-shirts.html
    Redirect 301 http://tshirts.com/white.html http://www.mainsite.com/white-t-shirts.html
    Redirect 301 http://tshirts.com/black-tshirts.html http://www.mainsite.com/bk-t-shirts.html When I check the 301 for, let's say, http://tshirts.com/blue.html, I get: http://tshirts.com/blue.html - 301 Moved Permanently - http://www.mainsite.com/tshirts.htmlblue.html - 302 Found - http://www.mainsite.com/ How do I fix this? Why is everything being appended to mainsite.com/tshirts.html? I appreciate your help.

    | inhouseseo
    0
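For what it's worth, the appending behaviour described above is characteristic of mod_alias: `Redirect` takes a URL-path (not a full URL) as its first argument and does prefix matching, so a rule anchored at `/` swallows every request and appends the remainder of the path to its target. A sketch of the path-based form, with mainsite.com used as a placeholder target throughout:

```apache
# mod_alias Redirect: the first argument is a URL-path on this host,
# not a full URL, and it matches as a prefix.
Redirect 301 /blue.html http://www.mainsite.com/blue-t-shirts.html
Redirect 301 /white.html http://www.mainsite.com/white-t-shirts.html
Redirect 301 /black-tshirts.html http://www.mainsite.com/bk-t-shirts.html

# Match the homepage exactly with a regex, so this rule cannot
# prefix-match (and mangle) every other path on the site.
RedirectMatch 301 ^/$ http://www.mainsite.com/tshirts.html
```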

  • Hi there, I'm just reaching out to ask for some help in understanding where 301 redirects should be set up on a website when keeping the same domain but not preserving the original filenames. Essentially, an old website is being completely overhauled and brought up to date from a technical and usability standpoint. While the SEO isn't great, naturally many of the pages have been indexed by Google over time. A few pages have decent statistics and I don't want to lose the juice from them, but they do still need a lot of improvement. So my question is this: would all the redirection take place in the .htaccess file only in this case? From reading here on Moz I think so, but I need to confirm it. I was reading this article, which has thrown me slightly: https://moz.com/learn/seo/redirection but that seems more complex, as the website in question was actually moving domains. I'm open to any insight; if you need further clarification or information, let me know.

    | SEODarren
    0

  • If you google the name of any person who has a LinkedIn profile and then locate that entry in the search engine results (LinkedIn profiles are usually on the first page for most people), you will see that they get microdata indexed, which is basically the person's location and headline from their profile. Looking at their markup, I see the location, which makes sense as it is in hCard format, but I do not see any microformat data around the headline. Any ideas how they get this?

    | stacks21
    0

  • Hello, Does anyone have any recommendations for good SSL providers? We are looking for 2 levels: cheap ones we can use for our clients (and potentially EV ones), and secure ones we can use where needed. I am not sure if there are any cheap and secure ones, or if that is a contradiction. I await everyone's input and recommendations. Thanks in advance

    | JohnW-UK
    1

  • I am a WordPress designer and can service national clients. I want to rank on the first page for the keyword "WordPress designer". This is a highly competitive keyword; Moz gives it a 68% difficulty. I rank locally in NJ for "WordPress designer NJ" and a multitude of local WordPress-based keywords, but I rank on page 10 of the SERPs in a private browsing session. Some of my national competitors rank on the first page without many links in. How could I be so low in the SERPs for the base keyword "WordPress designer", but number 1 every time for the local "WordPress designer NJ"? What should be my plan of attack if I want to rank much higher, hopefully on page 1, for this one specific keyword? I think this one keyword could increase business greatly, so it is a top priority for me. I appreciate all advice 🙂

    | donsilvernail
    0

  • So you've been MOZing and SEOing for years and we're all convinced of the 10x factor when it comes to content and ranking for certain search terms... right? So what do you do when some older sites that don't even produce content dominate the first page of a very important search term? They're home pages with very little content and have clearly all dabbled in pre Panda SEO. Surely people are still seeing this and wondering why?

    | wearehappymedia
    0
