
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi guys, we have a client whose main site is .com but who has a .co.uk and a .com.au site promoting the same company/brand. Each site is verified locally with a local address and phone number, but when we create universal content for the sites, should I rel=canonical those pages on the .co.uk and .com.au sites to the .com site? I saw a post from Dr. Pete that suggests I should, as he outlines pretty closely the situation we're in: "The ideal use of cross-domain rel=canonical would be a situation where multiple sites owned by the same entity share content, and that content is useful to the users of each individual site." Thanks in advance for your insight!

    | wcbuckner
    0
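
For reference, the cross-domain rel=canonical discussed above is just a link element on each regional page pointing at the .com URL. Below is a minimal Python sketch, using made-up example.com / example.co.uk / example.com.au URLs rather than the client's real pages, that fetches each regional copy and prints whatever canonical it currently declares, which is a quick way to audit the current state before consolidating:

```python
# Minimal sketch: audit which canonical each regional copy currently declares.
# The URLs below are made-up placeholders, not the poster's real pages.
import requests
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

pages = [
    "https://www.example.com/universal-article",
    "https://www.example.co.uk/universal-article",
    "https://www.example.com.au/universal-article",
]

for url in pages:
    html = requests.get(url, timeout=10).text
    parser = CanonicalParser()
    parser.feed(html)
    print(f"{url} -> canonical: {parser.canonical}")
```

If the decision is to consolidate, the .co.uk and .com.au copies would each declare the .com URL as their canonical, while the .com page keeps a self-referencing one.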

  • Hi all, if I had a global domain with local country pages on it, e.g. xxxx.com/uk/xxxx, xxxx.com/usa/xxxxx, xxxxx.com/au/xxxx, what's the best way to ensure that the relevant country gets the relevant pages, i.e. the /uk/ pages show in the UK, /usa/ pages in the USA, /au/ pages in Australia, etc.? Is this a Google Webmaster Tools setting? Thanks!

    | Diana.varbanescu
    0
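
The setup described above is usually handled with geotargeting in Webmaster Tools plus hreflang annotations so each country folder is tied to its market. Here is a minimal Python sketch, using placeholder paths in place of the xxxx.com examples, that prints the reciprocal hreflang link elements every version of the page would carry:

```python
# Minimal sketch: build reciprocal hreflang annotations for one page's
# country versions. Paths are placeholders standing in for the xxxx.com URLs.
variants = {
    "en-gb": "https://xxxx.com/uk/page",
    "en-us": "https://xxxx.com/usa/page",
    "en-au": "https://xxxx.com/au/page",
}

# Every version of the page should carry the full set of alternates,
# including a self-referencing entry.
for lang, url in sorted(variants.items()):
    print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
```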

  • Hello, We currently have a lot of color variations on multiple products with almost the same content. Even with our canonicals set, Moz's crawling tool seems to flag them as duplicate content. What we have done so far: chosen the best-selling color variation (our "master product") and added a rel="canonical" to every variation (with the "master product" as the canonical URL). In my opinion, that should be enough to address the issue. However, given that Moz still flags these pages as duplicates, I was wondering if there is something else we should do. Should we add "noindex,follow" to our child products and "index,follow" to our master product? (That sounds like a heavy change to me.) Thank you in advance

    | EasyLounge
    0

  • We have a www.domain.net that forwards to www.domain.com. About 5 days ago, when searching for our brand term, I noticed that www.domain.net took the top position and most of our www.domain.com rankings had dropped. Any ideas what could cause something like this to happen?

    | crapshoot
    0
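
A useful first check for the situation above is what the .net "forward" actually returns: a 301 passes the signal to the .com, while a 302 or a masked frame forward (a 200 response with the .com content framed in) can leave the .net competing with it. A minimal Python sketch against the placeholder domains from the question:

```python
# Minimal sketch: see what the .net forward actually returns.
# www.domain.net / www.domain.com are the placeholders from the question.
import requests

resp = requests.get("http://www.domain.net/", allow_redirects=False, timeout=10)
print("Status:", resp.status_code)                 # 301 is the permanent signal; 302 or 200 is suspect
print("Location:", resp.headers.get("Location"))   # where (if anywhere) it redirects
if resp.status_code == 200 and "<frame" in resp.text.lower():
    print("Looks like masked/frame forwarding rather than a redirect.")
```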

  • My website report says http://www.enigmacrea.com/diseno-grafico-portafolio-publicidad and http://www.enigmacrea.com/diseno-grafico-portafolio-publicidad?limitstart=0 have the same content, so I have duplicate pages. The only difference is the ?limitstart=0 parameter. How can I fix this? Thanks in advance

    | kuavicrea
    0
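
The usual fix for the duplicate above is to have the ?limitstart=0 version declare the clean URL as its canonical (or to tell Google to ignore the parameter in Webmaster Tools). A minimal stdlib-only Python sketch of the URL normalisation involved, using the URL from the question:

```python
# Minimal sketch: strip the ?limitstart parameter to derive the canonical URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, ignored_params=("limitstart",)):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored_params]
    return urlunsplit(parts._replace(query=urlencode(kept)))

url = "http://www.enigmacrea.com/diseno-grafico-portafolio-publicidad?limitstart=0"
print(canonical_url(url))
# -> http://www.enigmacrea.com/diseno-grafico-portafolio-publicidad
```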

  • We are looking at implementing HTTPS for our site. I have done a little research but can't find anything recent; http://moz.com/community/q/duplicate-content-and-http-and-https is the most recent thing I found. Does everything in those answers still apply? Should I just 301 redirect everything to the new HTTPS URLs, or add a canonical tag?

    | EcommerceSite
    0
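
For an HTTP-to-HTTPS move like the one described above, the common approach is still a sitewide 301 from every http URL to its https twin, with the https pages carrying self-referencing canonicals. A minimal Python sketch, with a placeholder URL list standing in for the real sitemap, that verifies the redirects behave that way:

```python
# Minimal sketch: verify that each http URL 301s to its https equivalent.
# The URL list is a placeholder; in practice it might come from the sitemap.
import requests

urls = ["http://www.example.com/", "http://www.example.com/products/"]

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and target == url.replace("http://", "https://", 1)
    print(f"{url}: {resp.status_code} -> {target} {'OK' if ok else 'CHECK'}")
```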

  • Hello, recently we had a lot of content written for our new website. Unfortunately my partner and I have gone our separate ways, and he has used all my unique content on his own website: all our product descriptions, about us, etc. He simply changed the name of the company. He has agreed to take the content down so that I can now put it on our new website, which is currently being designed. Will Google see this as duplicate content because it has been on a website before, even though the content has been removed from the original website? I was worried because the content is no longer "fresh", so to speak. Can anyone help me with this?

    | Alexogilvie
    0

  • Hello Mozzers, I have a question about the best way to handle filters and sorts with Googlebot. I have a page that returns a list of widgets. I have a "root" page about widgets, and then filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns 3 red widgets on the top and 7 non-red widgets on the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so obviously Google views each combination as a separate URL. Right now we don't do anything special for Google, but I have noticed in the SERPs that if I search for "Widgets", my "Widgets" and "Widgets - Blue" pages sometimes rank close to each other, which tells me Google basically (rightly) thinks these are all just pages about widgets. Ideally, though, I'd just want to rank for my "Widgets" root page. What is the best way to structure this setup for Googlebot? I think it's one or more of the following, but I'd love any advice: (1) put a rel=canonical tag on all of the pages with parameters and point it to the "root"; (2) use the Google parameter tool and have it not crawl any URLs with my parameters; (3) put a meta robots noindex on the parameter pages. Thanks!

    | jcgoodrich
    0
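
To make the first option above concrete, here is a minimal Python sketch of emitting a canonical tag that points every filtered or sorted widget URL back at the parameter-free "root" page. The URL and parameter names are illustrative assumptions, not the poster's real ones; the third option would instead emit a robots meta noindex on the parameterised pages:

```python
# Minimal sketch of the poster's first option: every filtered/sorted URL
# declares the "root" widgets page as canonical. The example URL and its
# parameters are assumptions for illustration.
# (Option 3 would instead emit <meta name="robots" content="noindex,follow">
# on the parameterised pages.)
from urllib.parse import urlsplit

def canonical_tag_for(url):
    parts = urlsplit(url)
    root = f"{parts.scheme}://{parts.netloc}{parts.path}"   # strip all parameters
    return f'<link rel="canonical" href="{root}">'

print(canonical_tag_for("https://example.com/widgets?color=red&sort=size"))
# -> <link rel="canonical" href="https://example.com/widgets">
```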

  • How could duplicate content be valuable, and why do I question noindexing it? My new client has a clever African safari route builder that you can use to plan your safari. The result is hundreds of pages that have different routes. Each page inevitably has overlapping content / destination descriptions; see the example links. To the point: I think it is foolish to noindex something like this, but is Google's algorithm sophisticated enough not to get triggered by it? http://isafari.nathab.com/routes/ultimate-tanzania-kenya-uganda-safari-july-november
    http://isafari.nathab.com/routes/ultimate-tanzania-kenya-uganda-safari-december-june

    | Rich_Coffman
    0

  • Hi Mozzers, We have recently rebranded with a new company name, and of course this necessitated relaunching our entire website onto a new domain. I watched the Moz video on how they changed domain, copying what they did pretty much to the letter. (Thank you, Moz, for sharing this with the community!) It has gone incredibly smoothly. I told all my bosses that we might see a 40% reduction in traffic / conversions in the short term. In the event (and it's still very early days), we have in fact seen a 15% increase in traffic, and our new website is converting better than before, so it's an all-round success! I was just wondering whether you think I should redirect my XML sitemap as well. So far I haven't, but despite us doing the change-of-address process in Webmaster Tools, I can see Google processed the old sitemap XML after we did the change of address. What do you think? I know we've been very lucky with the outcome of this rebrand, but I don't want to rest on my laurels or get tripped up later down the line. Thanks everyone! Amelia

    | CommT
    0

  • I have a client who has definitely been penalized: rankings dropped for all keywords, and hundreds of malicious backlinks show up when checked with WebMeUp. However, when I run the backlink profile in Moz or any other tool, they don't appear anywhere, and all the links are dead when I click on the actual URL. That being said, I can't disavow links that don't exist, and they don't show up in Webmaster Tools, but I KNOW this site has been penalized. Also, I noticed this today (attached). Any suggestions? I've never come across this issue before.

    | 0102345
    0

  • Hello, We are having some issues upgrading our stack and maintaining Wordpress for our blog.  So we are thinking about splitting them up.  What are the SEO implications of moving our blog to a subdomain? Our blog URL structure is currently something like https://www.aplossoftware.com/blog/p/2470/fund-accounting/yearend-closing-checklist/. We would like to change to something like https://blog.aplossoftware.com/p/2470/fund-accounting/yearend-closing-checklist/

    | stageagent
    0

  • The site below has social media properties and other sites coming up before its own listing, even for an exact search of the site name. Any ideas why this is happening? Link Any input is appreciated.

    | SEO5Team
    0

  • Hi, I work on a site that has a robust Q&A forum. Members post questions and other members answer them. The answers can be lengthy, often written by experts with Google+ pages, and there are almost always multiple member/commenters answering a particular question, much like Moz's forum here. In order to get rich snippet results in search for a single Q&A page, what would happen if each of, for instance, 10 commenters on a page were tagged as author? After all, the Q&A forum pages have many authors, each the author of their own comments. Or should I pick one comment out of many and call that member/commenter the author, or something else? If it matters, the person asking the question in the forum is almost always not the expert providing the bulk of the detailed content. Also, a question might be 8 words, one answer might be 25 to 500 words or more, and there might be 5 to 10 different answers. Thanks! Cheers... Darcy

    | 94501
    0

  • Hi, Every once in a while I need to add a graph / chart to my site.
    Google offers a nice HTML5 chart builder, and so do other web apps. The question is... which should I use?
    A pinnable and shareable image graph, or an interactive HTML5 graph? Thanks

    | BeytzNet
    0

  • I know several people have asked this before, but a lot of those threads were back in 2012, before many of the Google changes. My question is the same, though. With all the changes to Google's algorithm, is it okay to put your link at the bottom of your client's website, like "Web design by ...", etc.? Part of the reason is to drive traffic, but also, if someone is actually interested in who designed the website, they will click it. But now, reading about how badly bad links can hurt you, it makes me second-guess whether this is OK. My gut feeling says no.

    | blackrino
    0

  • Is it possible to avoid a duplicate content error without limiting a post to only one category or tag?

    | Mivito
    0

  • Hi All, We just discovered that Google is indexing a subset of our URLs embedded with our analytics tracking parameter. For the search "dresses" we are appearing in position 11 (page 2, rank 1) with the following URL: www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp?cm_mmc=Email--Anthro_12--070612_Dress_Anthro-_-shop You'll note that "cm_mmc=Email" is appended. This is causing our analytics (CoreMetrics) to mis-attribute this traffic and revenue to Email vs. SEO. A few questions: 1) Why is this happening? This parameter is from an email sent in June 2012, and we don't have an email-specific landing page embedded with it. Somehow Google found and indexed this page with these tracking parameters. Has anyone else seen something similar happening?
    2) What is the recommended method of "politely" telling Google to index the version without the tracking parameters? Some thoughts on this:
       a. Implement a self-referencing canonical on the page.
          - This is done, but we have some technical issues with the canonical due to our ecommerce platform (ATG). Even though the page source code looks correct, Googlebot is seeing the canonical with a JSESSIONID.
       b. Resubmit both URLs via the WMT Fetch feature, hoping that Google recognizes the canonical.
          - We did this, but given the canonical issue it won't be effective until we can fix it.
       c. URL parameter handling change in WMT.
          - We made this change, but it didn't seem to fix the problem.
       d. 301 or noindex the version with the email tracking parameters.
          - This seems drastic, and I'm concerned that we'd lose ranking on this very strategic keyword. Thoughts? Thanks in advance, Kevin

    | kevin_reyes
    0
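
Given the suspicion above that Googlebot sees a canonical polluted with a JSESSIONID, one quick diagnostic is to fetch both URLs with a Googlebot-style User-Agent and inspect the canonical each response exposes. A minimal Python sketch using the URLs from the question (the regex is a rough check and assumes rel appears before href in the link element):

```python
# Minimal sketch: check what canonical Googlebot-like requests actually see,
# and whether a jsessionid leaks into it. URLs are taken from the question.
import re
import requests

UA = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}
urls = [
    "http://www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp",
    "http://www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp"
    "?cm_mmc=Email--Anthro_12--070612_Dress_Anthro-_-shop",
]

for url in urls:
    html = requests.get(url, headers=UA, timeout=15).text
    # Rough pattern; assumes rel="canonical" comes before href in the tag.
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
    canonical = match.group(1) if match else None
    flag = "jsessionid present!" if canonical and "jsessionid" in canonical.lower() else ""
    print(f"{url}\n  canonical: {canonical} {flag}\n")
```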

  • Per the PageSpeed recommendations, I specified the Expires header in my .htaccess file. Do I need to add code for Last-Modified too? I thought I read somewhere that it will put the date next to the meta description in the SERPs, which might make the result seem outdated after a while. Are there any problems that could crop up if these aren't implemented correctly?

    | kimmiedawn
    0
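
A quick way to confirm what the .htaccess change above actually sends is to request an asset and print the caching-related headers; Expires and Last-Modified travel in the HTTP response, so this shows exactly what crawlers and browsers receive. A minimal Python sketch with a placeholder asset URL:

```python
# Minimal sketch: confirm which caching headers the server now sends
# after the .htaccess change. The asset URL is a placeholder.
import requests

resp = requests.head("https://www.example.com/images/logo.png", timeout=10)
for header in ("Expires", "Cache-Control", "Last-Modified", "ETag"):
    print(f"{header}: {resp.headers.get(header, '(not sent)')}")
```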

  • We have a few sports news websites that are picked up by Google News. Once in a blue moon, one of our articles ranks for a great keyword and shows in one of the 3 listings that Google News gets in the SERPs. Any tips on how we can optimise more of our articles to compete for these 3 positions?

    | betnl
    0

  • Hi folks, We are planning to implement a cross-domain canonical tag for a client, and I'm looking for some information on Bing's support for the cross-domain canonical tag. Does anyone know if there was a public announcement made by Bing or any representative about support for this tag? By the way, the best info I've found is a Q&A here on Moz about it, http://moz.com/community/q/does-bing-support-cross-domain-canonical-tags, but I'm looking for information from Bing itself on the topic.

    | fabioricotta-84038
    0

  • Case: We are currently in the middle of a site migration from .asp to .NET and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught, they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits. Issue: My assumption is that many of the internal links (from pages which are now 301ed as well) to these primary landing pages still point to the old versions of the primary landing pages in Google's cache, and thus have not passed the importance and internal juice to the new versions. There are no navigational links or entry points to the old supporting pages left, and I believe this is what is driving the decline. Proposed resolution: I intend to create a series of HTML sitemaps of the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (not as sitemaps, just normal pages) with the selection to index all linked pages. My intention is to force Google to pick up all of the 301s, thus enforcing the authority channels we have set up. Question 1: Is the assumption that the decline could be because of missed authority signals reasonable? Question 2: Could the proposed solution be harmful? Question 3: Will the proposed solution be adequate to resolve the issue? Any help would be sincerely appreciated. Thank you in advance, David

    | FireMountainGems
    0
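
Before building the HTML sitemaps proposed above, it may be worth confirming that each old .asp URL now resolves in a single 301 hop to its new page, since any lingering 302s or redirect chains would undercut the plan. A minimal Python sketch with a placeholder URL list:

```python
# Minimal sketch: print the redirect chain for each old .asp URL so you can
# confirm it is a single 301 hop to the new page. URLs are placeholders.
import requests

old_urls = ["http://www.example.com/old-landing-page.asp"]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=15)
    hops = [(r.status_code, r.url) for r in resp.history] + [(resp.status_code, resp.url)]
    print(url)
    for status, hop_url in hops:
        print(f"  {status}  {hop_url}")
```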

  • Hello, I am little confused about providing dofollow links from our website. We have a social shopping website where users can create catalog of their favorite products by bookmarking them from other websites, so our website might have thousands of outbound links. Now the confusion is whether we should have these links "nofollow" or "dofollow"? As per my understanding dofollow links will pass juice to other websites but on the other hand it might benefit us as well, sellers might come and bookmark their products for getting dofollow links. I read somewhere that if we have quality outbound links around a topic, google treats us as hub for that topic. But I am not clear if we will get this advantage only when these links are dofollow? Please help.

    | saurabh1905
    0

  • I had some rich snippets (recipes and stars) showing for my site, but in the last few days they have gone. Has anyone had this happen? If so, what did you do to get them back? The example URL is as follows: http://www.gourmed.gr/syntages/pestrofa-sto-tigani-synodeyomeni-me-sauvignon-2003-karypidis Everything seems OK in the Google Structured Data Testing Tool. Any thoughts on why?

    | canonodigital
    0

  • What is the best approach to make my sites mobile-ready, in terms of SEO? Is it better to create a subdomain called "m.mydomain.com" and redirect mobile users to that domain with a lite version of my sites? Or is it better to keep the same domain as the desktop version, "mydomain.com", and use a WordPress theme that adapts to every device, for example the Twenty Fourteen WordPress theme? I see that most big sites use an "m.mydomain.com" subdomain for the mobile version; however, I don't see any sense in creating a subdomain when you can just use the adaptive WP theme on the main domain. Any insight please? Thanks!

    | BloggerGuy
    0

  • Does anybody know of a website that can tell you when an external link to a site was created? Or any other way of finding this out? Thanks

    | RobSchofield
    0

  • I have a page (page 1) with a lot of unique content which may rank for "Example for sale". On this page I interlink to a page (page 2) with very limited unique content, but a page I believe is better for the user, with the anchor "See all Example for sale". In other words, the 1st page is more like a guide with items for sale mixed in, whereas the 2nd page is purely a "for sale" page with almost no unique content, but very engaging for users. Questions: (1) Is it risky that I interlink with "Example for sale" to a page with limited unique content, as I risk not being able to rank for either of these 2 pages? (2) Would it make sense to "noindex, follow" page 2, as it has limited unique content and is actually a page that exists across the web on other websites in different formats (it is real estate MLS listings), while still keeping the "Example for sale" link pointing to page 2, without risking losing page 1's ranking for the "Example for sale" keyword phrase? I am basically trying to work out the best solution to rank for "Keyword for sale", and my dilemma is that page 2 is best for users but is not a very unique page, while page 1 is very unique and OK for users but mixes writing, pictures, and properties for sale.

    | khi5
    0

  • A member of our marketing team wants to use Outbrain Select to curate content to augment the original content we have on our site. http://www.outbrain.com/select/how This content would be taken from other pages and displayed as though it is on our site via JavaScript. Obviously this page would be set up as noindex. This seems to me like something Google would frown on. Does anyone know the SEO implications of using a tool like this? I'm concerned Google will see links to a blank, noindexed page and find it suspect.

    | LyntonWeb
    0

  • I've got a WordPress website (a client's), and Moz keeps showing missing meta descriptions. When I look at the pages, they are nonsense pages; they do exist somewhere, but I am not seeing them on the backend. Questions: (1) How do I fix this? Maybe it's a rel=canonical issue? (2) Why is this referring to "nonsense" pages? When I go to the page there is nothing on it except maybe an image or the headline; it's very strange. Any input out there I greatly appreciate. Thank you

    | SOM24
    0

  • We are building several sites for several clients which will use images from the manufacturer. Our dev team wants to insert the manufacturer's URL for the images instead of actually downloading the images and hosting them on our server. There are thousands of images, so downloading them to our server would be time-consuming, and we are looking for a shortcut... however, I'm concerned this will cause other issues. Is using manufacturersdomain.com/12345.jpg going to cause SEO issues? Will this generate Google penalties? Since we are not able to control the image file names, we cannot optimize them. We will add alt text and a title tag for each image, but the file names are random characters. How important is the file name for SEO?

    | Branden_S
    0

  • Hey, as SEOs we usually optimize for keywords that are at least two words long. Let's say I'm trying to optimize a page for terms like "man clothing, man london clothing, man great collection, man stylus collection", and as you can guess, I optimize the page for these keywords by putting them into the title, heading tags, and body.
    So my question is: what if Google takes the "man" phrase from my two-word keywords and treats it as my keyword? (I mean, what if Google thinks my keyword is "man", because as you can see, "man" appears in all of them.)
    And what if Google thinks the density of "man" is around 20%, which is an astronomical number? Sorry for my bad English.

    | atakala
    0

  • Dear all, I was dealing with a penalized domain (Penguin, Panda): hundreds of spammy links (disavowed with no success), thin content resolved in part, and so on... I think the best way forward is to start a fresh domain, but we want to use some of the well-written content from the old (penalized) site. To do this, I will mark the source (penalized) pages as NOINDEX and move their content to the new fresh domain. Question: do you think this is a safe approach, or do you know of a better strategy? I'll appreciate your point of view. Thank you

    | SharewarePros
    0

  • Any recommended hosting company or package (shared hosting, VPS hosting, or dedicated hosting)? Which one should I buy to help with website ranking?

    | AlexanderWhite
    0

  • I'm managing a blog that has a lot of articles with Page Authority 1. I have already checked with On-Page Grader that these articles are Grade A, so their on-page SEO structure is solid. I would like ideas for raising the Page Authority of these existing, already-written articles; what changes can effectively be made to get their Page Authority higher? Thanks in advance and regards, Jorge Pascual

    | goperformancelabs
    0

  • Was trying to get this question answered in another thread, but someone marked it as "answered" and no more responses came. So the question is about best practices for TLDs vs ccTLDs. I have a .com TLD with DA 39 that redirects to the localized ccTLDs .co.id and .com.sg, which have DA 17. All link building has been done for the .com TLD. In terms of content, it sometimes overlaps, as the same content shows up on both ccTLDs. What is best practice here? It doesn't look like my ccTLDs are getting any juice from the TLD. Should I just take my ccTLDs and combine them into my TLD as subdomains? Will I see any benefits? Thanks V

    | venkatraman
    0

  • Hey everyone, My homepage has not been ranking for its primary keyword in Google Australia for many months now. Yesterday, when I was using a UK proxy and searching via Google UK, I found my homepage/primary keyword ranked on page 8 in the UK. In Australia my website now ranks on page 6, but for other pages on my website (and it always changes from page to page). Previously my page was popping up at the bottom of page 1 and on page 2. I've been trying many things and waiting weeks to see if they had any impact, for over 4 months now, but I'm pretty lost for ideas, especially after what I saw yesterday in Google UK. I'd be very grateful if someone has had the same experience or has suggestions on what I should try. I did a small audit on my page, and because the site is focused on one product and features the primary keyword throughout, I took steps to try and fix the issue. I did the following: I noticed the developer had added H1 tags in many places on the homepage, so I removed them all to make sure I wasn't getting an over-optimization penalty. Cleaned up some of my links because I was not sure if this was the issue (I've never had a warning within Google Webmaster Tools). Changed the title tags/H tags on secondary pages not to feature the primary keyword as much. Made some pages 'noindex' to see if this would take the emphasis off the secondary pages. Resubmitted my XML sitemaps to Google. Just recently claimed a local listing in Google (still need to verify) and fixed up citations of my address/phone numbers etc. (however, it's not a local business - it sells Australia-wide). Added some new backlinks from AU sites (only a handful though). The only other option I can think of is to replace the name of the product on secondary pages with a different variation, to make sure that the keyword isn't featured there. Some other notes on the site: When I do a 'site:url' search, my homepage comes up at the top. The site sometimes ranked for a secondary keyword on the front page in specific locations in Australia (but it goes to a localised city page); I've noindexed these as a test to see if something with localisation is messing it around. I do have links from AU, but I also have links from .com and elsewhere. Any tips or advice would be fantastic. Thanks

    | AdaptDigital
    0

  • Good morning / afternoon / evening all, We are continually working on our website, www.movingeverywhere.co.uk. It has suffered a drastic drop in rankings with the last 2 Google algorithm updates, which we have been working to resolve. This has involved: redesigning the website (responsive now), increasing speed, reducing code, and generally providing better UX and an all-round better experience for the user; signing up to Moz and resolving any issues which have been highlighted (hopefully fixed the last ones today); investigating our inbound link profile to try and weed out any bad incoming links or any links that were damaging the site; and increasing our social network profile and reach. We have done competitor analysis and we are beating all of our competitors on on-site factors as per the Moz results, but it appears we are missing something, which means we are not reaping the fruits of our efforts at the moment. The site is WordPress, and we read there could be a canonical issue with WordPress sites. We are asking the Moz community for any guidance and assistance to try and diagnose any negative factors affecting the SEO effort on the site. Thank you for your time and help.

    | wtfi
    0

  • If one looks at a page on our client's website (http://truthbook.com/urantia-book/paper-98-the-melchizedek-teachings-in-the-occident, for example), there are a huge number of links in the body of the page. All internal links are normal links. All external links are rel="nofollow" class="externallink". We have two questions: 1. Could we be penalized by Google for having too many links on these pages? Will this show in our webmaster reports? 2. If we are being penalized, can we keep the links (and avoid the penalty) if we made the internal links rel="nofollow" class="externallink" as well? We need these internal links to help people use these pages as an educational tool. This is why these pages also have audio and imagery. Thank you

    | jimmyzig
    0

  • cybercig.co.uk is languishing around 150-200 in the rankings, barely making it above 70, but it also ranks on the first page for Refillable Electronic Cigarette. Any ideas what's happening? It doesn't have a huge number of links, but I'd have thought it would've been much higher. I'd love to know your opinions 🙂

    | jasondexter
    0

  • I'm working on a site that has directories for service providers and content about those services. My idea is to organise the services into groups, e.g. Web, Graphic, Software Development, since they are different topics. Each subdomain (hub) has its own sales pages, directory of service providers, and blog content. E.g. the web hub has web.servicecrowd.com.au (hub home), web.servicecrowd.com.au/blog (hub blog), and http://web.servicecrowd.com.au/dir/p (hub directory). Is this overkill, or will it help in the long run when there are hundreds of services like dog grooming and DJing? It seems better to have separate subdomains and unique blogs for groups of services and content topics.

    | ServiceCrowd_AU
    0

  • We have an issue with Google indexing multiple versions of each page in our sitemap (www.upmc.com). We've tried using rel=canonical, but it appears that Googlebot is not honoring our canonicals. Specifically, any of the pages Google indexes that end without a file extension such as .aspx are 302 redirected to an .aspx page. Example - the following pages all respond as 302 redirects to http://www.upmc.com/services/pages/default.aspx: http://www.upmc.com/services/ http://www.upmc.com/services http://www.upmc.com/Services/ http://www.upmc.com/Services Has anyone been able to correct this inherent issue with SharePoint so that the redirects are at least 301s?

    | Jessdyl
    0

  • I'm looking for an SEO company that has substantial experience with the Magento shopping cart system. I've gone through Moz.com's Recommended List, but I'm unsure who specializes in Magento. Thanks.

    | UncleXYZ
    0

  • Is it possible to be penalized on an interior page but not the whole website? Here's why I ask: I have a page, www.thesandiegocriminallawyer.com/domestic-violence.html, that is not ranking well (p. 21 of Google) while the rest of the site ranks well (between p. 1 and p. 3). I checked the link profile in Open Site Explorer, ahrefs, and MajesticSEO but can't find any problems. I have also checked the HTML code, CSS, and keyword optimization, but can't find any problems there either. Can anyone give me insight into why this might be happening? Of course, I'm working under the assumption that this page SHOULD be ranked higher for "San Diego Domestic Violence Attorney" - at least higher than page 21.

    | mrodriguez1440
    0

  • Does this count as duplicating content even though the meta description has no effect on search results?

    | USAMM
    0

  • Hey Guys, witnessSF.org (WP), witnessLA.org (Tumblr), witnessTO.com (WP), witnessHK.com (WP), and witnessSEOUL.com (new site, no redirects needed) are being moved over to sf.ourwitness.com, la.ourwitness.com, and so forth, all under one large WordPress MU instance. Some have hundreds of articles/links, others a bit fewer. What is the best method to take? I understand there are easy redirects, and the completely manual one-link-at-a-time approach. Even WP to WP, the permalinks are changing from domain.com/date/post-name to domain.com/post-name. Here are some options: 1) Just redirect all previous witnessla.org/* to la.ourwitness.com/ (the automatic "direct all pages to the home page" deal - easiest but not the best). 2) Download the Google Analytics report of top URLs - about 50 URLs have significant rankings and traffic (in LA's sample) - and just redirect those to custom links (most bang for the buck: the articles that rank are manually pointed to the correct place). 3) Best of both worlds may be possible? Automated, perhaps? I prefer working with .htaccess vs a redirect plugin for speed reasons. Please advise. Thanks guys!

    | vmialik
    0
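
For option 2 above, one lightweight approach is to generate the .htaccess lines from the exported list of old paths, dropping the date portion of the old permalinks as they are mapped onto the new subdomain. A minimal Python sketch with made-up example paths (the real list would come from the Google Analytics export):

```python
# Minimal sketch: turn old witnessla.org permalinks into .htaccess 301 lines
# pointing at la.ourwitness.com, dropping the date portion of the permalink.
# The example paths are made up; the real list would come from GA.
import re

old_paths = [
    "/2013/05/12/some-post-name/",
    "/2014/01/03/another-post/",
]

NEW_HOST = "https://la.ourwitness.com"
DATE_PREFIX = re.compile(r"^/\d{4}/\d{2}/\d{2}(/.+)$")

for path in old_paths:
    match = DATE_PREFIX.match(path)
    new_path = match.group(1) if match else path
    print(f"Redirect 301 {path} {NEW_HOST}{new_path}")
```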

  • Hey guys, We're looking to move a blog feed we have to a new static URL page. We are using 301 redirects, but I'm unsure of what to do regarding page 2, page 3, etc. of the feed. How do I make sure those URLs are being redirected as well? For example: moving FloridaDentist.com/blog/dental-tips/ to a new page URL, FloridaDentist.com/dental-tips. So we are using a 301 on that old URL to the new one. My question is what to do with the other pages, like FloridaDentist.com/blog/dental-tips/page/3. How do we make sure that page is also 301'd to the new main URL?

    | RickyShockley
    0
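
The paginated URLs above can be caught with a single pattern rather than one rule per page number. A minimal Python sketch of the mapping logic, equivalent to what one RedirectMatch rule in .htaccess would do, using the URLs from the question:

```python
# Minimal sketch: map the paginated feed URLs onto the new static page.
# Mirrors what a single RedirectMatch pattern would do in .htaccess.
import re

PATTERN = re.compile(r"^/blog/dental-tips(?:/page/\d+)?/?$")
NEW_PATH = "/dental-tips"

for old in ["/blog/dental-tips/", "/blog/dental-tips/page/3"]:
    if PATTERN.match(old):
        print(f"301: {old} -> {NEW_PATH}")
```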

  • Hi All, When searching for our brand in Google France, I noticed that some of our major competitors show up beneath our Knowledge Graph listing. My managers and I are wondering if this is something Google just does as an associated search, or if there's a way we can work around it. Thanks, and please see the attached image 🙂

    | CSawatzky
    0

  • Hi, I've just been researching backlinks and newswires... One of them told me they put nofollow links on news releases, but some of their network would reproduce the news release without nofollow links. I suspect the network includes some not-so-brilliant websites, so even if I just use the URL rather than anchor text for backlinks, I'm thinking there probably is a risk. The other seemed to have more control over its network, from what they said - they didn't auto-syndicate and used nofollow - but I still suspect there's a risk that your news release will end up on not-so-good websites. Has anybody out there recently experienced problems with the backlinks produced by newswire services? Beyond that, your general input would be welcome too.

    | McTaggart
    0

  • http://www.oreillyauto.com/site/c/search/Wiper+Blade/03300/C0047.oap?make=Honda&model=Accord&year=2005&vi=1430764 How is O'Reilly getting this page indexed? It shows up in organic results for [2005 honda accord windshield wiper size].

    | Kingof5
    0

  • For one of our clients we are building a career site and putting it under a different URL and hosting service (mainly due to security concerns about hosting it under the same host and domain). Almost 100% of the incoming traffic to their current career section (which is in a sub-folder) comes from branded keywords (brand + job/career/employment); that is, there are no job-position-specific keywords. The client is now worried that after moving the site, the inbound traffic to the main site will be severely affected, as well as the SERP results. My questions are: will the non-career-related SERPs be affected? I don't see how they would be, but I could be wrong. If not, how can we reassure her that the main site's SEO won't be affected? Are there any case studies of a similar case (splitting part of a website onto a new URL and hosting service)? Thank you for your help. PS: this is my first post, so please forgive me if this has been asked before. I could not find a good response.

    | rflores
    0
