
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Any reason why I could be ranking on Google but not Bing?

    | edward-may
    0

  • Hey there Mozzers, I have a question about internal links. If I am writing an article about something and want to link to another one of my articles inside my blog, do I have to make that link nofollow or dofollow? If possible, tell me why as well. Thanks in advance

    | Angelos_Savvaidis
    0

  • About to completely redo a client's site and I want to make sure I don't lose our link juice. The current site is an old template site from another provider. They host it and we do not have access at all to the site itself, so there will be no transferring of the site from server to server, because they feel the site is their property. Basically the site is a monthly service, not a product. So this will be a completely new website, including a new URL structure. So my question is: how do I keep the link juice flowing to the new site? I know I need to use 301 redirects, but do I rebuild those old URLs on my site and redirect them to their new counterparts, or what? The link profile is not that impressive, maybe 15 backlinks (mainly going to the homepage), but they are all local and come from pretty good domain authority, and it's keeping us ahead of our competition. Back story: this is one of my local search clients; we now have them ranking #1 across the board in the local packs. After analyzing the traffic, they are losing 75% of all traffic because of the site's design, so a new site is a must. I build a lot of websites, but have never worried about the backlink profile before now. Thanks for all your help!

    | masonrj
    0
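Assuming the domain itself stays with the client and can be pointed at the new server, the usual pattern behind the question above is to recreate each old path as a 301 on the new host. A minimal `.htaccess` sketch (Apache assumed; the old/new paths are hypothetical):

```apache
# Map each old template-site URL to its closest new counterpart.
# One rule per old URL; with ~15 linked pages this stays manageable.
Redirect 301 /old-services.html /services/
Redirect 301 /old-about.html /about/
```

Pages with no real counterpart can fall back to the homepage, though a topically relevant target preserves more of the link's value than a blanket redirect.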

  • I've read through a lot of material on this subject to this point, which has been helpful. In making a major decision like this, I'd love another set of eyes on it, as a lot of the material I read is pretty dated. Thanks for your help! Background: the company is currently maintaining the following sites, some in multiple languages: company.com, company.us, company.de, company.fr, etc. (12 ccTLDs, some multilingual). Each site represents a physical office/distributorship in each location. Each ccTLD site (and its pages) includes both duplicate and unique, localized content (intermixed). Each country office will be producing content for its ccTLD, though some content will be duplicated from the .com. In essence, there is a .com corporate site template, and the ccTLDs will be customized but include a lot of the content on the .com corporate multilingual site. Some of the ccTLDs rank OK, some don't; all SEO strategy to date has been implemented by independent marketing companies in each country. I am working on a centralized strategic SEO approach. Approach: my initial thought was to leverage the .com domain internationally by consolidating all ccTLDs within the .com site using sub-directories. Since some regional sites are also multilingual, the consolidated site structure might look like this: company.com/en-us/, company.com/en-de/, company.com/de-de/, company.com/en-fr/, company.com/fr-fr/. This would allow location-specific content to be presented in multiple languages. But when I learned how much customization/localization would need to be done (each country maintaining its own blog, etc.), and started evaluating things like the length of the URLs for marketing purposes, the necessity of having multiple users accessing certain sections of the site, some insight that the ccTLDs will likely rank better than the consolidated .com, and research of other sites (amazon.com has ccTLDs for each country), I began to reconsider my initial strategy and to re-evaluate a .com corporate site in multiple languages with regional ccTLDs carrying a blend of duplicate and unique content instead.
Beyond business needs, my primary concern is preventing duplicate content. I can already see issues arising between the .com corporate multilingual site in French, for instance, and the company.fr regional site, which would contain some of the same corporate content plus a lot of its own unique localized content. I am imagining the corporate .com actually having to defer to the ccTLDs via rel=canonical to avoid duplicate content issues, which doesn't seem too natural (maybe just in the case where the .com corporate site would use rel=canonical to the .us office site). I've had a lot of success consolidating sites and working to build a single, strong, trusted, authoritative domain vs. having to build that same authority, in this case, a dozen times across all the ccTLDs. I am not sure if the ability to leverage a single .com multi-regional/multilingual site outweighs the benefit of a ccTLD for a site that operates in a single country. What do you think? What solution would you recommend, all things considered? Please let me know if I am missing something. I enjoyed the challenge of weighing all the factors and am at a point where I could really use some feedback from colleagues. The developers are building the site(s) in Drupal. Thanks!

    | seagreen
    0
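Whichever architecture wins in the question above, the cross-language/cross-region duplication is normally signalled with hreflang annotations rather than rel=canonical (a canonical would hide one version entirely). A sketch of what the consolidated sub-directory version might declare on each page, reusing the structure from the question (URLs illustrative only):

```html
<link rel="alternate" hreflang="en-us" href="http://company.com/en-us/" />
<link rel="alternate" hreflang="de-de" href="http://company.com/de-de/" />
<link rel="alternate" hreflang="fr-fr" href="http://company.com/fr-fr/" />
<link rel="alternate" hreflang="x-default" href="http://company.com/" />
```

The same annotations work across ccTLDs (href="http://company.de/" etc.), so hreflang by itself doesn't force either architecture.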

  • My website is lonestarperio.com and I am losing my website ranking. Is my SEO agency doing any black-hat SEO or doing anything wrong to my website?

    | bondhoward
    1

  • About a year (or 2) ago, Matt Cutts said that Twitter and FB have no effect on website rank, in part because Google can't get to the content. Now that Google will be indexing Twitter (again), do we expect that links in twitter posts will be useful backlinks for improving SERP rank?

    | Thriveworks-Counseling
    1

  • I am not a developer - I am researching this for our team, so please, be gentle... I am also not quite sure how to ask this question. We want to serve up custom pages for visitors from Google organic. We aren't doing anything underhanded - the pages will have very small differences that will not affect our rankings and won't land us in Google jail. When a Google visitor hits one of our pages, what specific piece of data are we looking for to determine: a. It's a Google visitor b. He/she came from organic results. I need to tell our developers to look for something that triggers the custom page. It's the same data that Google Analytics uses to trigger the appropriate visitor type. Please pardon my naivete.

    | AMHC
    0
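A quick sketch of the check the developers in the question above would implement. The usual signal is the HTTP referrer: visits from organic results arrive with a google.* referrer. The function name below is hypothetical, and note the limits called out in the comments:

```javascript
// Sketch: classify a visit as "Google organic" from the HTTP referrer.
// In the browser the referrer is document.referrer; this helper takes the
// string directly so it can run anywhere. Function name is hypothetical.
function isGoogleOrganic(referrer) {
  if (!referrer) return false; // direct visits carry no referrer
  let url;
  try {
    url = new URL(referrer);
  } catch (e) {
    return false; // malformed referrer string
  }
  // Organic clicks arrive from a google.* host. Because Google search is
  // served over https, usually only the origin (e.g. https://www.google.com/)
  // is passed, with no query string to inspect.
  return /(^|\.)google\.[a-z.]+$/.test(url.hostname);
}
```

The referrer alone cannot separate organic from paid: Google Analytics distinguishes paid clicks via the gclid parameter appended to the landing-page URL, so "google.* referrer and no gclid on the landing URL" is the closest practical approximation. It is also worth confirming the "very small differences" stay within Google's guidelines, since varying content by search referrer is exactly what cloaking checks look for.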

  • The full URL of a particular site's homepage is something like http://www.example.com/directory/.
    The canonical and og URLs match.
    The root domain 301 redirects to it using the absolute path. And yet the SERP (and the cached version of the page) lists it simply as http://www.example.com/. What gives? Could the problem be found at some deeper technical level (.htaccess or DirectoryIndex or something?) We fiddled with things a bit this week, and while our most recent changes appear to have been crawled (and cached), I am wondering whether I should give it some more time before I proceed as if the SERP won't ever reflect the correct URL. If so, how long? [EDIT: From the comments, see here: https://www.youtube.com/watch?v=z8QKIweOzH4#t=2838]

    | TheEspresseo
    0
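For reference, a minimal sketch of the root-to-subdirectory redirect described in the question above, as it might look in `.htaccess` (Apache assumed; /directory/ is the placeholder from the question):

```apache
RewriteEngine On
# 301 the bare root to the canonical homepage in the subdirectory
RewriteRule ^$ /directory/ [R=301,L]
```

If the server instead serves the subdirectory's content directly at / (via DirectoryIndex or an internal rewrite rather than an external 301), Google may keep the shorter URL in the SERP, which could explain the display.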

  • I was asked by our IT team whether switching to a tag management company that removes the pixels from our site and replaces them with JavaScript would have a negative impact on SEO. I have not been able to find anything that discusses this. Does anyone have experience with this? Has it caused any issues? How do crawlers see pixel data, and what do they do with it?

    | Shawn_Huber
    0

  • Good morning Moz community 🙂 What do you guys think would be the best practice for a starting blogger offering guest articles to third-party blogs, when it comes to building up my own website's SEO (assuming I have a link in the guest article to my website)? 1. If I have the opportunity to post the guest article on two or more different blogs, should I go for it? -OR- 2. Only post the article on one specific blog and write a different one for the others? In a world with unlimited resources, the latter option would prevail, but considering that it takes time to write, what would you recommend if I am trying to build my website's rankings? Carlos

    | 90miLLA
    0

  • Hi again! I've got a site where around 30% of URLs have less than 250 words of copy. It's big though, so that is roughly 5,000 pages. It's an ecommerce site and not feasible to bulk up each one. I'm wondering if noindexing them is a good idea, and then measuring if this has an effect on organic search?

    | Blink-SEO
    1

  • Hi, I have to change old URLs to new ones on the same domain, and all landing pages will stay the same: domain.com/old-url has to change to domain.com/new-url. Altogether it is more than 70,000 URLs. What is the best way to do that? Should I use 301 redirects? Is it possible to do in code, or how? What could you suggest? Thank you, Edgars

    | Edzjus333
    0
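With 70,000 one-to-one mappings like the question above describes, listing individual Redirect lines becomes unwieldy; on Apache, RewriteMap is the usual tool (server/vhost config only, not `.htaccess`). A sketch, assuming a plain-text map file at a hypothetical path:

```apache
# httpd.conf / vhost context
RewriteEngine On
# redirects.txt holds one "old-url new-url" pair per line
RewriteMap oldnew txt:/etc/apache2/redirects.txt
RewriteCond ${oldnew:$1} !=""
RewriteRule ^/(.+)$ ${oldnew:$1} [R=301,L]
```

On nginx a `map` block serves the same purpose. Either way, a 301 per URL is the right mechanism.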

  • The benefits of alt tag optimization for traditional SEO have always been a "yo-yo" subject for me. Way back in the day (2004 to 2007) I believed there was some benefit to alt tag SEO. However, as time went on I saw evidence that the major search engines were no longer considering alt tags as a ranking signal. I later had the pleasure of working on a joint project with a high-end SEO firm in 2011/2012, and my colleagues fully believed that alt tag optimization was still a very important strategy for traditional SEO at that time. Is there any evidence available that alt tags still help with traditional SEO nowadays? I'm fully aware of the benefits of optimized alt tags for image search. But could optimized alt tags be one of those ranking factors that Google removed due to abuse and later quietly resurrected?

    | RosemaryB
    0

  • Hi Guys, Any good tips on how to build PA and DA for a new website? It is about 3 months old now, and our DA is at 5 while our PA is 1. Any good ideas from past experience on how to build these up? Cheers

    | edward-may
    0

  • I'm working in WordPress at the moment, changing the content of a page on my website. The page has a lot of educational information and each section is unique. I had to go through and edit each section in Google Docs, and now I'm posting all the new pages and making the old pages private on WordPress. Is this a good idea? I'm worried Google will still crawl my private education pages and think these are duplicates, since the new pages somewhat resemble the old. Also, should I be 301 redirecting all the old education pages to the corresponding new ones even if they are private on WordPress? I understand that the 301 redirect should only be used if you want the old page to go to a new one. What I don't understand is whether this will still be relevant or work if I've already made the old page private on WordPress. Thank you!

    | SapphireCo
    0

  • We have moved a client's website over to a new CMS and onto a new server. The domain and URLs on the main pages of the website are exactly the same, so we did not do any 301 redirects. The overall Domain Authority of the site and the Page Authority of the homepage, while having dropped a bit, seem OK. However, all the other pages now have a PageRank of 1. I'm not exactly sure what the IT guys have done, but there was some re-routing applied at the server level. The move happened around the end of December 2014, and yes, traffic has dropped significantly. Any ideas?

    | daracreative
    0

  • Hello, At my company we have instances where client-facing people leave the company, and so we need to remove their profile pages from the website. As opposed to people receiving a 404 when they search for them, I thought it would be best to divert visitors to a generic landing page explaining that the person they are looking for has left the company, with details on how to get in touch. I'm tempted to use a 302 redirect so the person they are searching for stays in the search results longer. But longer-term, will this cause any harm? Should it eventually be turned into a 301 redirect? Or should I just use a 301 in the first instance? Thanks in advance, Stu

    | Stuart26
    0
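A sketch of the two stages described in the question above, in `.htaccess` terms (Apache assumed; the paths are hypothetical). A 302 keeps the old URL in the index for a while; switching to a 301 later tells search engines the move is permanent and consolidates the old page's equity onto the landing page:

```apache
# Temporary, while the old profile should stay findable in search:
Redirect 302 /people/jane-doe/ /staff-changes/
# Later, once the change is considered permanent:
# Redirect 301 /people/jane-doe/ /staff-changes/
```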

  • Hello, I have a blogging platform which spammers unfortunately used a few years ago to create spam blogs. Since then, we've added spam filters, and even if I can't assume there are no spam blogs left, I can say that most of the blogs are clean. The problem is, in Google Webmaster Tools we have a Pure spam message on the Manual Actions page (https://support.google.com/webmasters/answer/2604777?hl=en), with a list of 1,000 blog links. All these blogs have been marked as spam in our system for at least a year; technically it means they return a 410 header and display something like "this blog doesn't meet our quality requirements". When I first saw the manual action message in GWT, I filed a reconsideration request. Google answered within a week saying that they had checked our website again, but when I went back to the Manual Actions page, there was still a "pure spam" message, with a different list of blogs, which had also already been marked as spam for at least a year. What should I do? Keep filing reconsideration requests as long as Google answers? Thank you in advance,

    | KingLouis
    0

  • Hi everyone, The quick answers box can be really helpful for searchers by pulling through content which answers their question or provides a clear description of an item or entity. Our client appeared in the quick answer box for a period of time with their description of a product, but has since been replaced by one of their competitors. Previously, the answer was provided by Wikipedia. Is there anything we can do to help get our client's content back in there? We've been looking at possible structured data we could use, but haven't found anything. We are also suggesting our client ensures they have a paragraph within their copy which is a clear, concise description of the product that Google can pull. Can anyone give any suggestions? Thanks Laura

    | tomcraig86
    0

  • Hi All, Screaming Frog has identified that we have a few H2 tags on our pages, although we only have 1 H1 tag. We have numerous H3s, H4s, etc. I am wondering: is it good SEO to have only 1 H2 tag, as with the H1 tag, or can you have more? Thanks, Peter

    | PeteC12
    0

  • Hi! We are in the process of overhauling our websites, and I am hoping that some of you can post URLs for websites that are ranking well and using lots of creative content to help rank their ecommerce category pages. You can post your own, or others that you admire.

    | AMHC
    1

  • Hello All, On our eCommerce site some products have additional information which we currently show via a PDF link next to the product. I am thinking: is it more beneficial from an SEO point of view if I were to put this additional PDF information on a webpage and have a link going from the product to it? From what I have read, Google cannot read the contents of PDFs, so if I had this as a webpage via a link, then the product page would get more keywords and strength around it, which would help improve its SEO. Just wondered if this is the best way forward or not? Thanks, Peter

    | PeteC12
    0

  • Hi all, We are running a classifieds platform in Spain (mercadonline.es) that has a lot of duplicate content. The majority of our duplicate content consists of URLs that contain site parameters. In other words, they are the result of multiple pages within the same subcategory that are sorted by different field names like price and type of ad. I believe that if I assign the correct group of URLs to each parameter in Google Webmaster Tools, a lot of these duplicate issues will be resolved. Still, a few questions remain: Once I set e.g. the 'page' parameter and choose 'paginates' as the behaviour, do I let Googlebot decide whether to index these pages, or do I set them to 'no'? Since I told Google Webmaster Tools what type of URLs contain this parameter, it will know that these are relevant pages, yet not always completely different in content. Other URLs that contain 'sortby' don't differ in content at all, so I set these to 'sorting' as the behaviour and set them to 'no' for Google crawling. What parameter can I use for 'search', i.e. the parameter that causes the URLs to contain an internal search string? Since this search parameter changes all the time depending on the user input, how can I choose the best one? I think I need 'specifies'? Do I still need to assign canonical tags for all of these URLs after this process, or is setting parameters in my case an alternative solution to this problem? I can send examples of the duplicates, but most of them contain 'page', 'descending', 'sortby', etc. values. Thank you for your help. Ivor

    | ivordg
    0
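On the parameter question above: parameter settings in Webmaster Tools only influence Google, whereas a rel=canonical on each parameterised variant works for every engine, and the two don't conflict. A sketch, with a hypothetical subcategory path on the site mentioned:

```html
<!-- On https://www.mercadonline.es/anuncios?sortby=price&descending=1 -->
<link rel="canonical" href="https://www.mercadonline.es/anuncios" />
```

Sorted and internal-search variants can all point at the clean listing URL this way; paginated pages are usually better annotated with rel=prev/next than canonicalised to page 1, since later pages contain different listings.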

  • I submitted an infographic to visual.ly with a source link and a little description. Are these links good for a website's link profile? And can I submit the same infographic to other websites? http://visual.ly/divya-ashwagandha-churna

    | bondhoward
    0

  • I added all my keywords to the description. Will this affect my website? Will Google take this as a negative signal? I am not adding keywords on my own website, but adding keywords to a third-party website: https://www.pinterest.com/pin/304555993526970292/

    | bondhoward
    0

  • Hi Guys, Google is not ranking www.cloudstix.com/e-liquid for the keyword 'e liquid' and it's driving me insane. I cannot understand why; can anyone please shed any light? -On the page we have used the variations e liquid, e-liquid and eliquid. -The e-liquid product pages are canonicalised to www.cloudstix.com/e-liquid. -3 other pages regarding e liquid were 301 redirected to the page, passing good authority. I did this as I believed these pages conflicted, since they seemed to target e-liquid. -'e-liquid' is being used as an anchor throughout the website pointing to www.cloudstix.com/e-liquid. -The 'e liquid' page has generally good authority: PA 22, DA 26. -The website has good anchor text linking to the site, all relevant and e liquid related, along with brand links. Currently the keyword 'e liquid' brings up the home page www.cloudstix.com ranked 100+. What's strange is that other terms relating to 'e liquid' bring up www.cloudstix.com/e-liquid, for example 'e liquid uk', 'the best e liquid' and 'e liquid cloudstix'. Any ideas on what the problem may be? Would appreciate any advice on this. Thanks guys! Liam

    | One2OneDigital
    0

  • Hello Gurus, hope everyone is having a fantastic day. Right, so I've been pulling my hair out over 404 errors on links such as: http://manvanlondon.co.uk/category/clients/removals/man-and-van-wandsworth. This link appears on the 'clients' category page, and the /removals/man-and-van-wandsworth part is the link that should take the user to the Man and Van Wandsworth page from the footer. However, this link and all the other links in the footer appear to be broken ONLY on these category pages/posts, thus creating 404 errors. And those pages (i.e. Man and Van Wandsworth) are not even categorized. The website is www.manvanlondon.co.uk. We tried various things in WordPress and nothing is working, including non-indexing. Has anyone met this problem before? Is there a way to fix it? Thank you for your time, and I hope my explanation makes sense. Monica

    | monicapopa
    0

  • Hi guys, I'm trying to understand the SEO behind our website's header: www.mountainjade.co.nz. As you can see, we have a paragraph of relevant, SEO-friendly introductory text in our header. What I would like some help with is understanding how Google views and assigns 'juice' to information like this in the header or footer of a website. Usually certain pages have content specific to a given topic, and Google ranks those pages accordingly. But with a website's header/footer, the content appears on every page, as the header is always at the top and the footer at the bottom. 1. In what way does my website benefit from the paragraph of text in the header? E.g. at the domain level? Just the home page? 2. How does Google assign 'juice' to the paragraph of text? (similar to Q1) 3. How would my website be affected if I moved the text to the footer? (an aesthetic change) 4. When I 'inspect element' on the paragraph, it is labelled 'div id=site description'. Can someone please explain the relevance of a site's description to SEO for me? This paragraph of text was in the website's header before I came onboard, and I've been too concerned to change or move it, as I don't know enough about it. Any help would be appreciated! Thanks team, Jake

    | Jacobsheehan
    0

  • Righto, so: I've been working on our company website www.nursesfornurses.com.au, which is built on .asp. That's a real pain, because the site is built so messily and on a very dated CMS, which means I have to go back to the dev every time I want to make a change. We've made the decision to move the site over to WordPress in stages. So (and I hope logically), I've started by making them a proper blog with better architecture, to start targeting industry-related keywords. I had to put it on a subdomain, as the current hosting does not support WordPress: http://news.nursesfornurses.com.au/Nursing-news/
    The previous blog is here: http://www.nursesfornurses.com.au/blog. It's not live yet, so I'm just looking for SEO advice or issues I might encounter by having the blog on a subdomain. In terms of user experience, I realise there needs to be a clearer link back to the main website; I'm just trying to work out the best way to do it... Any advice or criticism is greatly welcomed. Thanks

    | 9868john
    0

  • I have a web site that has been around for a long time. The industry we serve includes many, many small vendors and, back in the day, we decided to allow those vendors to submit their details, including a link to their own web site, for inclusion on our pages. These vendor listings were presented on location (state) pages as well as more granular pages within our industry (we called them "topics"). I don't think it's important any more, but 100% of the vendors listed were submitted by the vendors themselves, rather than us "hunting down" links for inclusion or automating this in any way. Some of the vendors (I'd guess maybe 10-15%) link back to us, but many of these sites are mom-and-pop sites and would have extremely low authority. Today the list of vendors is in the thousands (US only). But the database is old and not maintained in any meaningful way. We have many broken links and I believe, rightly or wrongly, we are considered a link farm by the search engines. The pages on which these vendors are listed use dynamic URLs of the form \vendors\<state>-<topic>. The combination of states and topics means we have hundreds of these pages, and they thus form a significant percentage of our pages. And they are garbage 🙂 So, not good. We understand that this model is broken. Our plan is to simply remove these pages (with the lists of vendors) from our site. That's a simple fix, but I want to be sure we're not doing anything wrong here from an SEO perspective. Is it as simple as that, just removing these pages? How much effort should I put into redirecting (301) the removed URLs? For example, I could spend effort making sure that \vendors\California-<topic> (and likewise for all states) goes to a general "topic" page (which still has relevance, but won't have any vendors listed). I know there is no definitive answer to this, but what expectation should I have about the impact of removing these pages?
Would the removal of a large percentage of garbage pages (leaving much better content) be expected to be a major factor in SEO? Anyway, before I go down this path I thought I'd check here in case I'm missing something. Thoughts?

    | MarkWill
    0

  • Hi, I wonder if anyone can help me. A recent site relaunch has caused a substantial loss of organic traffic on http://www.health4mom.org/. The site went live 2 months ago (using the same URLs as the previous one); unfortunately, organic traffic has dropped (-65%). I would appreciate it if anyone could let me know why we lost organic traffic. Many thanks in advance. Antonio

    | One2OneDigital
    0

  • I am trying to set up permalinks on a WordPress blog that is installed on IIS. I can't update the web.config file, so I have to make every page /index/pagetitle, as shown here: http://codex.wordpress.org/Using_Permalinks#PATHINFO:_.22Almost_Pretty.22 How much of a difference is there between having no /index and having the /index in there?

    | EcommerceSite
    0

  • Hi Guys, I am working on the SEO strategy of an adult e-commerce website, and I don't understand how small competitors/websites with a poor domain authority, link profile and on-page optimisation are outranking us across the top high-search-volume search terms. We have already fixed several issues with the site, including canonical tags, duplicate content, the link profile, etc. I would be interested to get an expert's opinion, as we have been working on this over the last 12 months and haven't seen improvements. My email is "damien@bangonline.com.au". Thanks in advance!

    | Jon_bangonline
    0

  • I am working through the latest Moz Crawl Report and focusing on the 'high priority' issues of Duplicate Page Content. There are some strange instances being flagged and so wondered whether anyone has any knowledge as to why this may be happening... Here is an example; This page; http://www.bolsovercruiseclub.com/destinations/cruise-breaks-&-british-isles/bruges/ ...is apparently duplicated with these pages; http://www.bolsovercruiseclub.com/guides/excursions http://www.bolsovercruiseclub.com/guides/cruises-from-the-uk http://www.bolsovercruiseclub.com/cruise-deals/norwegian-star-europe-cruise-deals Not sure why...? Also, pages that are on our 'Cruise Reviews' section such as this page; http://www.bolsovercruiseclub.com/cruise-reviews/p&o-cruises/adonia/cruising/931 ...are being flagged as duplicated content with a page like this; http://www.bolsovercruiseclub.com/destinations/cruise-breaks-&-british-isles/bilbao/ Is this a 'thin content' issue i.e. 2 pages have 'thin content' and are therefore duplicated? If so, the 'destinations' page can (and will be) rewritten with more content (and images) but the 'cruise reviews' are written by customers and so we are unable to do anything there... Hope that all makes sense?! Andy

    | TomKing
    0

  • Our current product pages' markup only has the canonical URL on the first page (each page loads more user reviews). Since we don't want to increase load times, we don't currently have a canonical view-all product page. Do we need to mark up each subsequent page with its own canonical URL? My understanding was that canonical and rel next/prev tags are independent of each other, so that we would mark up the paginated pages like this:
    Product page #1:
    <link rel="canonical" href="http://www.example.co.uk/Product.aspx?p=2692" />
    <link rel="next" href="http://www.example.co.uk/Product.aspx?p=2692&pageid=2" />
    Product page #2:
    <link rel="canonical" href="http://www.example.co.uk/Product.aspx?p=2692&pageid=2" />
    <link rel="prev" href="http://www.example.co.uk/Product.aspx?p=2692" />
    <link rel="next" href="http://www.example.co.uk/Product.aspx?p=2692&pageid=3" />
    But this would mean that each canonical URL suggests to Google another piece of unique content, which it obviously isn't. Is the prev/next able to "override" the canonical and explain to Googlebot that it's part of a series? Wouldn't the canonical then be redundant? Thanks

    | Don34
    0

  • My website is built around a template. The hosting site says I can only add code into the body of the webpage, not the header; will this be OK for rel=canonical? If it is, my next question is redundant, but as there is only one place to put it, which URL do I need to place in the code: http://domain.com, www.domain.com or http://www.domain.com? The /default.asp option for my website does not seem to exist, so I guess it is not relevant. Thanks

    | singingtelegramsuk
    0
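One point worth flagging on the question above: search engines only honour rel=canonical inside the document's <head>; a canonical link placed in the body is ignored, so body-only template access is a real limitation. For reference, the element itself, using the www form as the example preferred version:

```html
<head>
  <link rel="canonical" href="http://www.domain.com/" />
</head>
```

Whichever variant is chosen (www or non-www), it just needs to be used consistently everywhere, ideally backed by a 301 from the other variant.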

  • Hello friends, need help 🙂
    Site age: 2 years, but rebuilt 2-3 months ago.
    Yesterday, rankings for all keywords suddenly dropped (even for the brand name), from the 1st page to pages 8-9.
    Niche: mp3/video downloads.
    One of my competitors also has the same problem (their site is 3 years old). Common between us:
    1. The meta description is the same for all pages.
    2. Having nofollow links (example: a user lands on page A; page A has nofollow links to page B, where the download links to the file exist).
    This has happened to me for the first time; can anybody help with their knowledge? Someone suggested I do a 301 to a new domain.

    | gurpreet1234
    0

  • I have a site that first got indexed about a year ago for several months. I shut it down and opened it up again about 5 months ago. 2 months ago I discovered that the upper case in pages like www.site.com/some-detail/Bizname/product was a no-no. There are no backlinks to these pages yet, so for search engines I put in 301 redirects to the lower-case versions, thinking that after a few weeks Bing and Google would figure it out and no longer try to crawl them. FYI, there are thousands of these pages, and they are dynamically created. Well, 2 months later, Google is still crawling the upper-case URLs, even though it appears that only the lower-case ones are in the index (from when I do a site:www.site.com/some-detail search). Bing is also crawling the upper case, although I'm not seeing any of the upper-case pages, and only a small percentage of the lower-case ones show using a site:www.site.com/details.... command. Assuming there are no backlinks, will they eventually stop crawling those upper-case pages? If not, again assuming there are no backlinks, should I 410 the upper-case pages, or will that remove any credit I am getting for the pages having existed for over a year prior to changing the upper to lower case?

    | friendoffood
    0
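Rather than waiting for the crawlers to lose interest in the question above, a single rewrite can 301 every uppercase variant to its lowercase twin. Apache's internal tolower map is the standard sketch (server/vhost config, since RewriteMap isn't allowed in `.htaccess`):

```apache
# httpd.conf / vhost context
RewriteEngine On
RewriteMap lc int:tolower
# Any request whose path contains an uppercase letter gets 301'd lowercase
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ ${lc:$1} [R=301,L]
```

With blanket 301s in place, a 410 shouldn't be necessary; the redirects themselves consolidate any history onto the lowercase URLs.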

  • Hey there Mozzers, Another question about a forum issue I encountered. When a forum thread has more than one page, as we all know the best course of action is to use rel="next" and rel="prev" (or rel="previous"). But my forum automatically creates another line in the header called rel="self". What that does is simple. If I have 3 pages:
    http://www.example.com/article?story=abc1
    http://www.example.com/article?story=abc2
    http://www.example.com/article?story=abc3
    then instead of just the next/prev annotations on the first, second and third pages, the forum also creates, on the first page, a rel="self" URL by appending ?page=1. That actually produces a duplicate page, because now instead of just http://www.example.com/article?story=abc1, I also have the same page at http://www.example.com/article?story=abc1?page=1. Do I even need rel="self"? I thought that rel="next" and rel="prev" were enough? Should I change that?

    | Angelos_Savvaidis
    0
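For comparison with the question above, standard pagination markup for the middle page of the three URLs quoted needs only prev and next; rel="self" is not part of the pagination scheme, and the ?page=1 duplicate it creates can be cleaned up with a canonical on that variant:

```html
<!-- On http://www.example.com/article?story=abc2 -->
<link rel="prev" href="http://www.example.com/article?story=abc1" />
<link rel="next" href="http://www.example.com/article?story=abc3" />
```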

  • Hi all, My site seems to have www.xyz.com, http://www.xyz.com, http://xyz.com and other variations (from an old agency doing this), all showing differing backlinks etc. So I want to merge them, so I can just look at one analytics account with everything combined. I want to consolidate everything to https://www.xyz.com, as the client wants. How do I do this? Does it take long to take effect? Also, I presume in Webmaster Tools I'll have to set up the preferred version? Thanks very much for any advice 🙂

    | VMLQais
    0
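The usual consolidation for the question above is a pair of rewrite conditions in `.htaccess` that 301 every variant (http, bare domain) to the single preferred origin. A sketch using the placeholder domain from the question (Apache assumed):

```apache
RewriteEngine On
# Force https and the www host in one 301
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.xyz.com/$1 [R=301,L]
```

The redirects take effect immediately for visitors; search engines typically take days to weeks to reflect the consolidation in their indexes.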

  • I have two websites that are extremely similar, and I want to consolidate them into one website by pointing both domain names at one site. Is this going to cause any duplicate content penalties by having two different domain names pointing at the same site? Both domains get traffic, so I don't want to just discontinue one of them.

    | Ron10
    0
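The usual way to keep both domains' traffic without serving duplicate content is a site-wide 301 from one domain to the other. A hedged Apache sketch, with placeholder domain names (secondary-domain.com, primary-domain.com) standing in for the real ones:

```apache
# Sketch: send every request for the secondary domain to the same
# path on the primary domain with a permanent redirect, so only
# one version of the site gets indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?secondary-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.primary-domain.com/$1 [L,R=301]
```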

  • Hello, I have a question about a website I have been asked to work on. It is for a real estate company which is part of a larger company. Along with several other (rival) companies it has a website of property listings which receives a feed of properties from a central hub site - so there is lots of potential for page, title and meta content duplication (if it isn't already occurring) across the whole network of sites. In early investigation I don't see any of these sites ranking very well at all in Google for expected search phrases. Before I start working on things that might improve their rankings, I wanted to ask some questions of you guys: 1. How would such duplication (if it is occurring) affect the SEO rankings of such sites individually, or the whole network/hub collectively? 2. Is it possible to tell if such a site has been "burnt" for SEO purposes, especially by any duplication? 3. If such a site or the network has been totally burnt, are there any approaches or remedies that can be made to improve the site's SEO rankings significantly, or is the only/best option to start again from scratch with a brand new site, ensuring the use of new meta descriptions and unique content? Thanks in advance, Graham

    | gmwhite999
    1

  • I have recently had a HUGE SEO issue costing my organization 250K a month in sales - I'm looking under all rocks now. This is something that has existed for a while, and I cannot find any issues related to the self-publishing website, but thought y'all may know.

    | michelle888
    0

  • We're on a .uk.com subdomain (scientifica.uk.com). Now, at the end of each page title, Google is automatically adding 'Scientifica - UK.COM', so a lot of our results now appear as [Page title] - Scientifica - UK.COM. It looks a bit of a mess and I'm worried about our CTR. Can anyone shed some light on this or know how to stop it? Thank you

    | JamesPearce
    0
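One signal that can be given to Google about the preferred site name is WebSite structured data on the homepage. A hypothetical JSON-LD sketch (there is no guarantee Google will honor it, and the name/url values below are assumptions that would need to match the live site):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Scientifica",
  "url": "https://scientifica.uk.com/"
}
</script>
```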

  • My least favorite part of SEO 😉 I'm trying to redirect an old URL that no longer exists to our new website, which is built with https. The old URL: http://www.thinworks.com/palm-beach-gardens-team/ New URL: https://www.thinworks.com/palm-beach-gardens/ This isn't working with my standard process of the Quick Redirection plugin in WP or through .htaccess, because the old site URL is at http and not https. Any help would be much appreciated! How do I accomplish this, where do I do it and what's the code I'd use? Thank you Moz community! Ricky

    | SUCCESSagency
    0
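Since .htaccess rules match on the request path rather than the protocol, a single mod_alias line in the site's .htaccess should cover the http-to-https case — a sketch using the URLs from the question:

```apache
# 301 the retired page to its new https counterpart in one hop.
# The rule matches the path, so it catches the old http URL too.
Redirect 301 /palm-beach-gardens-team/ https://www.thinworks.com/palm-beach-gardens/
```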

  • Hello, In our e-commerce business we saw significant growth in organic traffic during the summer months.
    This is a period in which we sell a lot of custom made photobooks. The weird thing is that we haven't made any changes to our photobook pages since 2013, when we didn't have these peaks.
    So this is good news, but... What would be a good way to start investigating the reason for this increase?
    I have a hard time answering the 'why' question. Thank you very much for your help! With kind regards, Yannick

    | ETonnard
    0
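One way to start on the 'why' is to compare organic sessions per landing page year-over-year and see which pages carry the lift. A sketch with made-up illustrative numbers (real figures would come from an analytics export):

```python
# Made-up sessions per landing page for two comparable summer periods.
sessions_2013 = {"/photobooks": 1200, "/calendars": 800, "/cards": 500}
sessions_2014 = {"/photobooks": 2900, "/calendars": 800, "/cards": 480}

def yoy_change(before, after):
    """Percent change per landing page, biggest gain first."""
    changes = {
        page: round((after[page] - before[page]) / before[page] * 100, 1)
        for page in before
    }
    return sorted(changes.items(), key=lambda kv: kv[1], reverse=True)

for page, pct in yoy_change(sessions_2013, sessions_2014):
    print(f"{page}: {pct:+.1f}%")
```

Pages whose growth far outpaces the rest are the place to dig further (new rankings, new referrers, seasonality in search demand).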

  • I have no idea why, or what their mistake/intent would be, but my mom's artist website (kathleenmrobison.com) has had its links/anchor text spammed for NFL jerseys - so weird. As seen in SEMrush, her site is actually ranking for some of these keywords - but we don't want/need these at all. Do we proactively disavow all of these sites with a disavow file, or just ignore them until we get problems with warnings? **Edit:** I also see that some fake URLs have been created, so it is definitely a spam/hack issue.

    | Joe.Robison
    0
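For reference, Google's disavow file is a plain text upload: one entry per line, lines starting with # are comments, and a domain: prefix drops every link from that site. A sketch with hypothetical placeholder domains:

```text
# Disavow file sketch -- the domains below are placeholders.
domain:cheap-nfl-jerseys-example.com
domain:jersey-spam-example.net
# Individual pages can also be listed as full URLs:
http://spammy-blog-example.com/nfl-jerseys-post.html
```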

  • We have 2 ecommerce sites that are 95% identical. Both sites carry the same 2000 products, and for the most part, have the identical product descriptions. They both have a lot of branded search, and a considerable amount of domain authority. We are in the process of changing out product descriptions so that they are unique. Certain categories of products rank better on one site than another. When we've deployed unique product descriptions on both sites, we've been able to get some double listings on Page 1 of the SERPs. The categories on the sites have different names, and our URL structure is www.domain.com/category-name/sub-category-name/product-name.cfm. So even though the product names are the same, the URLs are different including the category names. We are in the process of flattening our URL structures, eliminating the category and subcategory names from the product URLs: www.domain.com/product-name.cfm. The upshot is that the product URLs will be the same. Is that going to cause us any ranking issues?

    | AMHC
    0
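To keep the old category URLs passing equity after the flattening, a pattern redirect can map them wholesale to the new form. A hedged Apache sketch built around the URL structure described above (the regex assumes every old product URL has exactly the /category-name/sub-category-name/product-name.cfm shape; any other two-level .cfm paths would need excluding):

```apache
# 301 /category-name/sub-category-name/product-name.cfm
# to the flattened /product-name.cfm, keeping the product file name.
RedirectMatch 301 ^/[^/]+/[^/]+/([^/]+\.cfm)$ /$1
```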

  • Hey there Webmasters of the Universe. So I have this problem with my forum. The platform I am using automatically creates extra pages for every page. For example, if my forum had one page called forum.com/example, the same page can be found at forum.com/example?page=1. If I set a rel canonical on the second one pointing to the first one, will that cause a problem for me? Thanks in advance!

    | Angelos_Savvaidis
    0
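A canonical from the parameter variant back to the clean URL is the standard approach for consolidating duplicates like this. On the ?page=1 duplicate it might look like (URL taken from the question):

```html
<!-- Emitted on forum.com/example?page=1, pointing at the clean URL -->
<link rel="canonical" href="http://forum.com/example">
```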

  • If the title didn't put you off, please read on! 🙂 According to our latest Moz Crawl Report we have circa 700 instances of duplicate meta descriptions on pages that are both dynamically created and paginated; however, I believe that number to be greater! We are unable to manually make changes to these pages (because they are dynamic) and so we need to ask our web devs to create a change in how the meta is generated... If I am not making myself clear (and there is a good chance that I'm not!) then here is an example of what I mean: http://www.bolsovercruiseclub.com/cruise-deals/silversea-cruise-deals/ There are 92 pages of cruise deals for this particular operator, with the results on each page having the option to sort by 4 categories: Recommended, Cruise Price, Sail Date, Best Value. 4 x 92 = 368 instances just for this one operator! The current meta description is: "A selection of the best Silversea cruise deals taking in over 800 destinations across all 7 continents." ...which isn't great, I know! The problem is how to make each page (in each category) unique. If any of you have encountered anything similar and have any kind of solution or recommendation then please respond - I would be most grateful! Andy

    | TomKing
    0
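One way the devs could attack this is to template the meta description from the variables each page already knows — operator, sort order, and page number — so every combination comes out unique. A hypothetical sketch (function name and wording are illustrative, not the site's actual code):

```python
def build_meta_description(operator, sort_by, page, total_pages):
    """Compose a unique meta description per operator, sort order and page."""
    return (
        f"Page {page} of {total_pages}: {operator} cruise deals "
        f"sorted by {sort_by.lower()}, taking in over 800 destinations "
        f"across all 7 continents."
    )

print(build_meta_description("Silversea", "Sail Date", 3, 92))
```

With 4 sort categories and 92 pages, every one of the 368 combinations then gets its own description without anyone editing pages by hand.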


