
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hello all, If a company acquires a smaller company and 'absorbs' its products and services into its own website, what is the protocol for closing down the smaller company's site? So far we've added our branding to the site alerting its visitors to the imminent takeover, and 301 redirected certain pages; soon we'll be redirecting all the pages to their counterparts on the main website. Once that's done, should we noindex the old site? Anything else? Thanks, Caro

    | Caro-O
    0

  • While doing my website crawl, I keep getting the message that I have tons of duplicate pages.
    http://example.com/index.php and http://www.example.com/index.php are considered to be duplicates. As I figured out, http://example.com/index.php is the canonical page, and I should point http://www.example.com/index.php to it. Could you please let me know whether I'd be doing the right thing by putting a canonical tag into my index.php file?

    | kirupa
    0
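For the www/non-www duplicate question above, a canonical link element normally looks like the following (a minimal sketch; example.com stands in for the real domain):

```html
<!-- Placed in the <head>; both the www and non-www versions of the page
     should emit the same tag, pointing at the one canonical URL -->
<link rel="canonical" href="http://example.com/index.php" />
```

In practice a server-level 301 from www to non-www (or vice versa) is the more robust fix, with the canonical tag as a supporting signal.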

  • Hi Everyone I have a website and am slowly getting to grips with SEO. Last week I enabled a new channel in google analytics which was "email" so I could track effectiveness of the weekly emails we send out. The good news is that a ton of traffic is now being assigned to the email "channel" in GA but my organic search traffic in channels is now down week on week. That feels odd as my overall traffic to the site is up, week on week. Does anyone have any experience of new channels  coming on stream and canniballising old ones? Could it be that some of the traffic associated to organic search previously was actually coming from my email, I just didn't know it? thanks all!

    | NappyValleyNet
    1

  • We're storing all our images in an S3 bucket, common practice, but we want these images to drive traffic back to our site -- and credit for that traffic. We've configured the URLs to be s3.owler.com/<image_name>/<image_id>. I've not seen any of these images show up in our webmaster tools. I'm wondering if we're actually not going to get credit for these images because technically they sit on another domain.

    | mindofmiller
    0

  • Hi, I've taken over the SEO of a quite large e-commerce site. After checking crawl issues, there seem to be 3000+ 4xx client errors, 3000+ duplicate content issues and 35000+ temporary redirects. I'm quite desperate regarding these results. What would be the most effective way to handle this? It's a Magento shop. I'm grateful for any kind of help! Thx,
    boris

    | posthumus
    0

  • Hi, what is the best tool to find related keywords for a given keyword? Basically I want to provide a keyword and find all the related keywords we can use to write articles. Also, is there any way to find which keyword a page is getting its traffic from? Thanks

    | skandlikp9
    0

  • Hey all, Was wondering if anyone else has come across this issue. Bing is reporting that the title and description tags are missing from the head of my WordPress blog. I can't seem to find any documentation on this. Thanks, Roman

    | Dynata_panel_marketing
    0

  • Hey All — I’m working with a site that is migrating to HTTPS and had a couple of questions. I read Moz’s ‘SEO Tips & Tricks for HTTPS’ post but want some clarification on a couple of items. Aside from using https canonicals... 1. What is the best way to preserve link equity from inbound links? Site-wide 301 redirect in .htaccess? 2. What is the best way to redirect internal links from http to https? The site uses absolute internal links. THX!

    | JJLWeber
    0
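On question 1 above: for an Apache site, a site-wide HTTP-to-HTTPS 301 is typically a short .htaccess rule. A sketch, assuming mod_rewrite is available:

```apache
# Redirect every HTTP request to its HTTPS counterpart,
# preserving host, path and query string
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

For question 2, since the site uses absolute internal links, updating them to https (or protocol-relative) in the templates avoids sending every internal click through the redirect.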

  • We have a page set up that has anchor text with header tags. There is an instance where the same anchor text is on the page twice, linking to the same page, and I know that Google will ignore the second instance. But the second instance also had an H2 tag, which I removed and put on the first instance of the anchor text, even though it's smaller. Is this good practice?

    | AliMac26
    0

  • Hi Team, I have made my new website live, but when checking in Google it is not showing in the search results (site: www.oomfr.com). Can anybody please advise?

    | nlogix
    0

  • We have a healthcare website which lists doctors based on their medical speciality, with a paginated series to list hundreds of doctors. Algorithm: a search for "Dentist" in the Newark locality of New York gives a result filled with dentists from Newark, followed by dentists in locations near Newark. So all localities under a city have the same set of doctors, jumbled and distributed across multiple pages based on nearness to the locality. When we don't have any dentists in Newark, we populate results for nearby localities and create a page. The issue: when the number of dentists in New York is <11, every Locality x Dentist page has jumbled-up results all pointing to the same 10 doctors. It is even more severe when we have only 1-3 dentists in the city: every locality page is exactly the same as the city-level page. We have about 2.5 million pages with the above scenario. City-level page: https://www.example.com/new-york/dentist - 5 dentists. Locality-level pages: https://www.example.com/new-york/dentist/clifton, https://www.example.com/new-york/dentist/newark - each contains the same 5 dentists as the New York city-level page, in the same or a jumbled order. What do you think we must do in such a case? We have discussed putting a noindex on locality-level pages, or applying a canonical pointing from locality level to city level, but we are still not 100% sure.

    | ozil
    0

  • I have a client in a niche building industry that provides 4 different services to it. She has provided me with a list of 131 past clients of hers that she wants hyperlinked on her site to theirs. The logic is that a lot of these clients are heavy hitters and quite impressive to their peers, so the links will reinforce my client's value. Is there a best practice for determining whether a link should be follow/nofollow? Should I be checking each site's spam score, PageRank, anything else? Some of these 131 links will be duplicated due to the client having performed more than one service for them.

    | JanetJ
    1

  • Hey crew! First off, this is a last resort asking this question here; GoDaddy has not been able to help, so I need my Moz fam on this one. So, common problem: my crawl report is showing I have duplicate home pages, www.answer2cancer.org and www.answer2cancer.org/home.html. I understand this is a common issue with Apache webservers, which is why the wonderful rel=canonical tag was created! I don't want to go through the hassle of a 301 redirect, of course, for such a simple issue. Now here's the issue: GoDaddy Website Builder does not make any sense to me. In WordPress I could just add the tag to the head in the back end, but no such thing exists in GoDaddy. You have to do this weird drag-and-drop HTML block, drag it somewhere on the site, and plug in the code. So I did that, but when I publish and inspect in Chrome I cannot see the tag in the head! This is confusing, I know; the guy at GoDaddy didn't stand a chance, lol. Anyway, much love for any replies!

    | Answer2cancer
    0

  • Background: A company I am working with recently consolidated content from several existing domains into one new domain. Each of the old domains focused on a vertical, and each had a number of product pages and a number of blog pages; these are now in directories on the new domain. For example, what was www.verticaldomainone.com/products/productname is now www.newdomain.com/verticalone/products/productname, and the blog posts have moved from www.verticaldomaintwo.com/blog/blogpost to www.newdomain.com/verticaltwo/blog/blogpost. Many of those pages used to rank in the SERPs but they now do not. Investigation so far: looking at Search Console's crawl stats, most of the product pages and blog posts do not appear to be being indexed. This is confirmed by using the site: search modifier, which only returns a couple of products and a couple of blog posts in each vertical. Those pages are not the same as the pages with backlinks pointing directly at them. I've investigated the obvious points without success so far: there are a couple of issues with 301s that I am working with them to rectify, but I have checked all pages on the old site and most redirects are in place and working. There is currently no HTML or XML sitemap for the new site (this will be put in place soon), but I don't think this is an issue since a few products are being indexed and appearing in SERPs. Search Console is returning no crawl errors, manual penalties, or anything else adverse. Every product page is linked to from the /course page for the relevant vertical through a followed link. None of the pages have a noindex tag on them, and the robots.txt allows all crawlers to access all pages. One thing to note is that the site is built using React, so all content is within app.js. However this does not appear to affect pages higher up the navigation tree, like the /vertical/products pages or the home page.
So the question is: "Why might product and blog pages not be indexed on the new domain when they were previously and what can I do about it?"

    | BenjaminMorel
    0

  • We have a relatively new site and I have noticed recently that Google seems to be indexing both the mobile and the desktop version of our site. There are some queries where the mobile version will show up and sometimes both mobile and desktop show up. This can't be good. I would imagine that what is supposed to happen is that the desktop version is the one that should be indexed (always) and browser detection will load the mobile version where appropriate once the user is on the site. Do you have any advice on what we should do to solve this problem as we are a bit stuck?

    | simonukss
    0
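For reference, when mobile and desktop live on separate URLs, Google's documented pattern is a bidirectional annotation so only the desktop URL is indexed. A sketch with placeholder hosts:

```html
<!-- On the desktop page (http://www.example.com/page) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page" />

<!-- On the mobile page (http://m.example.com/page) -->
<link rel="canonical" href="http://www.example.com/page" />
```

If the same URL serves both versions via browser detection, a `Vary: User-Agent` response header is the usual signal instead.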

  • Hi, I am not sure where this came from: ?_escaped_fragment_= But in Webmaster Tools we are seeing hundreds of pages with this, and thus it is saying that we have pages with duplicate title tags. How do I fix this, or remove it? Regards, T

    | Taiger
    0
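?_escaped_fragment_= URLs come from the old AJAX crawling scheme. If the site does not intentionally serve HTML snapshots, one common cleanup is a 301 that strips the parameter; an Apache sketch, assuming the parameter can simply be dropped:

```apache
# Send ?_escaped_fragment_=... URLs back to the clean URL
# (the trailing ? discards the query string)
RewriteEngine On
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=
RewriteRule ^(.*)$ /$1? [R=301,L]
```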

  • Hi, title tags on our website are being truncated by Google even though they can be very short (sometimes < 40 characters) and have very few capital letters. We would like to understand why. Example: Principal component analysis (pca) in abcde - OurBrand shows up as: Principal component analysis (pca) in abcd... - OurBrand where abcde is the name of a very common piece of software (5 characters), and OurBrand is a 6-character string (which could be used in either lower case or upper case). Even when removing the brackets around pca, truncation still occurs... Any clue why?

    | trigaudias
    1

  • Sorry, the title may not make the most sense as I'm not entirely sure how my question would be phrased. https://developers.google.com/structured-data/breadcrumbs#examples We have breadcrumbs on our site, generated by a plugin. So for example we have: Where am I: Homepage > Page 1 > Page 2 > Page 3. Do we have any way to implement this without development being involved? Alternatively, is there any way to use the current URL? (We do use folders.) So an example being: http://domain.com/page1/page2/page3 Probably not possible, but I live in hope.

    | ThomasHarvey
    0
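If a script block can be pasted into the template without structural development work, BreadcrumbList markup can be supplied as JSON-LD inside a `<script type="application/ld+json">` tag. A sketch using the placeholder page names from the question:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Homepage", "item": "http://domain.com/" },
    { "@type": "ListItem", "position": 2, "name": "Page 1", "item": "http://domain.com/page1/" },
    { "@type": "ListItem", "position": 3, "name": "Page 2", "item": "http://domain.com/page1/page2/" }
  ]
}
```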

  • We have recently made updates to our xml sitemap and have split them into child sitemaps. Once these were submitted to search console, we received notification that the all of the child sitemaps except 1 produced 404 errors. However, when we view the xml sitemaps in a browser, there are no errors. I have also attempted crawling the child sitemaps with Screaming Frog and received 404 responses there as well. My developer cannot figure out what is causing the errors and I'm hoping someone here can assist. Here is one of the child sitemaps: http://www.sermonspice.com/sitemap-countdowns_paged_1.xml

    | ang
    0

  • Hi,
    I have a website with 1000+ URLs, all of which are already indexed in Google. Now I am going to stop all the available services on the website; I have removed all the landing pages and only the home page remains. So I need to remove all the indexed URLs from Google. I have already used the robots.txt protocol for removing URLs, but I guess it is not a good method for adding a bulk amount of URLs (nearly 1000) to robots.txt. So I just wanted to know: is there any other method for removing indexed URLs?
    Please advise.

    | nlogix
    0
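Rather than listing ~1000 paths in robots.txt (which blocks crawling but does not reliably deindex), a server-level 410 Gone is a common approach, since Google drops pages that return it on recrawl. An Apache sketch, where /services/ and /landing/ are hypothetical prefixes standing in for the retired sections:

```apache
# Return 410 Gone for retired sections so crawlers drop them
RedirectMatch 410 ^/services/
RedirectMatch 410 ^/landing/
```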

  • We've acquired another company and want to redirect their soon-to-be-obsolete website to ours. It includes a blog with many blog posts. Should we: 1. only 301 redirect the top-level blog URL, 2. try to redirect individual posts to blogs of a similar topic on our site (least practical, I'm sure), or 3. redirect all their individual posts to our main blog URL? Thanks, Caro

    | Caro-O
    1

  • Hi MOZers, I have an ecommerce website with its mobile version sitting on a subdomain. It is going to be transferred to another subdomain on the same website. How do I ensure that I save/carry over most of the traffic, authority and equity to the new subdomain? At the moment we are not looking to get rid of the subdomain, but maybe later, yes. Malika

    | Malika1
    0

  • Let's consider a scenario for a job classifieds site. So far the way we are handling XML sitemaps is as a consecutive series containing only ads, kept historically: http://site.com/sitemap_ads_1.xml http://site.com/sitemap_ads_2.xml ... http://site.com/sitemap_ads_99.xml These sitemaps are constantly updated as each ad is published, keeping expired ads, but I'm sure there is a better way to handle them. For instance, we have other sources of content besides ad pages, like search-result pages (careers, location, salary, level, type of contract, etc.) and blog content, but we are not adding them yet. So what I'm suggesting is to reduce the ads sitemaps to just one, including only the ads that are active (not expired), and add another XML sitemap for search results, another for blog content, another for images, and finally one for static content such as home, FAQ, contact, etc. Do you guys think this is the right way to go?

    | JoaoCJ
    0
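The split described above is usually tied together with a sitemap index file, so only one URL ever needs submitting to search engines. A sketch using the site.com placeholder and hypothetical child sitemap names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://site.com/sitemap_ads.xml</loc></sitemap>
  <sitemap><loc>http://site.com/sitemap_search.xml</loc></sitemap>
  <sitemap><loc>http://site.com/sitemap_blog.xml</loc></sitemap>
  <sitemap><loc>http://site.com/sitemap_images.xml</loc></sitemap>
  <sitemap><loc>http://site.com/sitemap_static.xml</loc></sitemap>
</sitemapindex>
```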

  • Hi guys, tl:dr - Should articles discussing a company's event (offline content) be nofollow? My company hosts a number of events across the year, during which we invite a selection of bloggers, journalists and interested parties from across the UK. During these events we show them the "behind the scenes" of our company as well as the manufacturing process and give them an amazing experience surrounded by our products. We never (ever) ask for write-ups or links, and leave the day entirely open every time. If people ask about articles or links, we always say it's entirely up to them if they wish to talk about their experiences. So, my question is: should any follow-up articles (for example reviews of the day, which bloggers will want to talk about) be nofollow? They're not reviewing any products, nor have they been paid or incentivised to talk about their experience. One could argue the event itself is incentive, however if this is the case then surely providing content is equally incentivising... The only difference is that the content we're providing is offline? Would be good to get people's thoughts on this!

    | JAR897
    0

  • Hi, The company I work for has developed a new website for a customer; their URL is https://www.wideformatsolutions.co.uk. I've created a sitemap which has 25,555 URLs. I submitted this to Google around 4 weeks ago and the most crawls that have ever occurred is 2,379. I've checked everything I can think of, including: speed of website, canonical links, 404 errors, setting a preferred domain, duplicate content, robots.txt, .htaccess, meta tags. I did read that Matt Cutts revealed in an interview with Eric Enge that the number of pages Google crawls is roughly proportional to your PageRank. But I'm sure it should crawl more than 2,000 pages. The website is based on OpenCart; if anyone has experienced anything like this I would love to hear from you.

    | chrissmithps
    0

  • Hi, I'm SEO'ing a Shopify site (new/not yet live) at the moment, and all the products are in a 'Products' subfolder, along the lines of: domain.com/products/blue-widgets/ etc. I understand that many ecommerce SEOs these days go for 'flat navigation' with all products 'off the root' rather than in a subfolder, and then communicate product and category/departmental relationships via breadcrumbs and other internal linking. In the case of a platform like Shopify, is this a good idea, or is it best to leave it as is, with the 'Products' subfolder a perfectly good place for the product pages? All Best, Dan

    | Dan-Lawrence
    0

  • Seeing a lot of duplicate content instances of seemingly unrelated pages. For instance, http://www.rushimprint.com/custom-bluetooth-speakers.html?from=topnav3 is being tracked as a duplicate of http://www.rushimprint.com/custom-planners-diaries.html?resultsperpg=viewall. Does anyone else see this issue? Is there a solution anyone is aware of?

    | ClaytonKendall
    0

  • Is a URL like the one below going to hurt SEO for this page? /healthcare-solutions/healthcare-identity-solutions/laboratory-management.html I like to match the URL and H1s as closely as possible, but in this case it looks a bit funky.

    | jsilapas
    0

  • My site has experienced a decrease in organic traffic WoW for the last two weeks, and for the first time all year is showing a decrease compared to last year's traffic for the same weeks. At first I thought this was a seasonal pattern due to spring break (we are mostly B2B), but the dip has sustained for another week. The only changes made during this period were a few on-page updates and some title tag updates to a specific group of pages. However, the decrease is sitewide, including branded clicks and impressions. I haven't noticed any changes in rankings. Impressions and clicks are down per Search Console, but CTR and average rank haven't changed. Is it possible that we've been penalized or hit by an algo shift? What's the best way to know for sure?

    | cckapow
    0

  • I am doing SEO for an appliance repair company. Their company website's domain doesn't have high authority, and I am going to increase that through link earning and content improvements. I think a better domain name might also help. The current URL contains the word "appliance" but doesn't have "repair" in it, and I am thinking a new domain containing both keywords would serve better. Could you please share your thoughts on this? Am I in the right direction, or not at all? I know Google penalizes mirror sites, since they are considered duplicate content. I'll upload my content to the new domain and make the old one point to the new URL. I am wondering if a canonical might help? Or would a 301 redirect be a better solution? Any advice would be highly appreciated! Thank you!

    | kirupa
    0

  • The site:domain.com search seems to show less pages than it used to (Google and Bing). It doesn't relate to a specific site but all sites. For example, I will get "page 1 of about 3,000 results" but by the time I've paged through the results it will end and change to "page 24 of 201 results". In that example If I look in GSC it shows 1,932 indexed. Should I now accept the "pages" listed in site: is an unreliable metric?

    | bjalc2011
    2

  • Currently this is what applies throughout the site: <meta property="og:locale" content="en_GB" /> How would one set this for properties in Italy or Spain, for example? (The language is all in English.) Regards, Tai

    | Taiger
    0
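og:locale describes the language/territory of the page's own content, and additional territories are declared with og:locale:alternate. A sketch; since the content stays in English, en_GB may simply remain correct, with alternates only if localized versions of the page exist:

```html
<meta property="og:locale" content="en_GB" />
<!-- Only if Italian/Spanish versions of the page actually exist -->
<meta property="og:locale:alternate" content="it_IT" />
<meta property="og:locale:alternate" content="es_ES" />
```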

  • Hi, Product pages on our site have a couple of elements that are lazy loaded / loaded after user action. Apart from images, which are a widely discussed lazy-loading topic, in our case videos and price graphs are lazy loaded. For videos we do something that Amit Agarwal recommended here: http://labnol.org/internet/light-youtube-embeds/27941/ - we load a thumbnail with a play button over it; when a user clicks that play button, the video embed from YouTube loads. However, we are not sure if Google gets that, and since the whole thing is under an H3 tag, will we a) lose the benefit of putting a relevant video there, or b) send any negative signals for only loading an image thumbnail under an H3 tag? We also have a price graph that lazy loads and is not seen in a cached version of our page on Google. Are we losing credit (in Google's eyes) for that content on our page? Sample page which has both the price history graph and video: http://pricebaba.com/mobile/apple-iphone-6s-16gb Appreciate your help! Thanks

    | Maratha
    0
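The light-embed pattern above can stay crawl-friendly if the placeholder also carries a plain link to the video that Google can follow, with the iframe only created on click. A minimal sketch, with hypothetical element and attribute names:

```javascript
// Build the YouTube embed URL used once the thumbnail is clicked.
function buildEmbedUrl(videoId) {
  return "https://www.youtube.com/embed/" + videoId + "?autoplay=1";
}

// Swap a placeholder element (holding a data-video-id attribute)
// for the real iframe; called from the play button's click handler.
function activatePlaceholder(placeholder) {
  var iframe = document.createElement("iframe");
  iframe.src = buildEmbedUrl(placeholder.getAttribute("data-video-id"));
  iframe.setAttribute("allowfullscreen", "");
  placeholder.parentNode.replaceChild(iframe, placeholder);
}
```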

  • At the moment I see that the Data Highlighter supports a few schemas such as Events, Products, Articles, Restaurants etc. My client has a training division and runs regular training courses, so we want to highlight those using the tool as "Training Courses," but this is not an option. Does anyone know why the tool doesn't support more categories, and if there are plans to expand what it supports? I realise it would be better to use actual HTML markup on the client site, but their website is administered by their corporate parent in another country and they are not prepared to add a WordPress plugin to allow this. The UK division, which we work for, wants to use it anyway. We have restricted access to the WordPress site, so we don't have the rights to add plugins ourselves; otherwise it would be no problem to do this.

    | mfrgolfgti
    1

  • Hi, Does anyone have experience moving a website to HTTPS? I am about to do so. I have 84 linking root domains and around 2k+ external links. If I move the website to HTTPS, will these links be lost? And how do I keep these links? Many thanks, Dusan

    | Chemometec
    0

  • Hi all, Since I moved from HTTP to HTTPS my PageSpeed score has dropped. My hosting company/programmer says this is normal, but I am not happy with the results. Before: mobile 69, desktop 89. Now: mobile 52, desktop 69. Anybody have experience with this issue? All help is welcome! Thank you in advance, Tymen

    | Tymen
    0

  • Hi, Looking for some advice. I have a local business website that was built and managed by a web developer. The site was/is very basic and really was only there as a place for potential customers to visit after finding out about us via more traditional local marketing. I decided to make the website work for us more and improve the SEO etc. to get it ranking better and finding us customers, rather than us sending customers to the website. Long story short, I wanted to change from an HTML site to a WordPress site to give me more control over updates/blogging etc. The web developer said he only works with HTML, so I decided to go it alone. As things stand the website hasn't been changed and is still hosted by the developer, but on the 12th of February he transferred the domain to me. Now, I'm not sure exactly what my DA was in February, but it was at least double figures; now it is 1. As I said, the only thing that has changed, as far as I'm aware, is the transfer of the domain to me. I'm at the point where I'm close to doing the transfer over to WordPress. I've been working on keywords, content etc. to make things better, but then noticed my issue. Anybody have any ideas why this would have happened? Or the process I can go through to find the root of the problem before I continue with the changeover? Thanks

    | icandoit
    0

  • In these days of CDNs does it matter for SEO whether images (and PDFs etc.) are hosted off-site? Does it make a difference if images hosted on Flickr, photobucket etc. Thanks

    | bjalc2011
    0

  • I recently updated a site to use a new WordPress theme. This theme, in conjunction with the Yoast SEO plugin, is causing duplicate title tags: the tags have the same title in them. I discovered this when adding new keywords and pages to the Page Optimizer in Moz. I have since turned off Moz to stop the duplicate title tag issue, however I am wondering if turning off Yoast is worse than the duplicate title tags. Any clarification on the duplicate page title issue and its consequences would be greatly appreciated.

    | donsilvernail
    0

  • What are the implications if a web designer codes the content of the site twice into the page in order to make the site responsive? I can't add the url I'm afraid but the H1 and the content appear twice in the code in order to produce both a responsive version and a desktop version. This is a Wordpress site. Is Google clever enough to distinguish between the 2 versions and treat them individually? Or will Google really think that the content has been repeated on the same page?

    | Wagada
    0
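For context on the question above, the usual responsive alternative keeps a single copy of the content in the DOM and varies only the presentation with media queries, so nothing is duplicated for Google to weigh. A CSS sketch with hypothetical class names:

```css
/* One H1, one content block; only the layout changes per viewport */
.content { width: 960px; margin: 0 auto; }

@media (max-width: 640px) {
  .content { width: 100%; padding: 0 1em; }
}
```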

  • We just transitioned mccacompanies.com to confluentstrategies.com. The problem is that when I search for the old name, the old website doesn't come up anymore to redirect people to the new site. On the local card, Google has even taken off the website altogether. (I'm currently still trying to gain access to manage the business listing) When I search for confluent strategies, the website doesn't come up at all. But if I use the site: operator, it is in the index. Basically, my client has effectively disappeared off the face of the Google. (In doing other name changes, this has never happened to me before) What can I do?

    | MichaelGregory
    0

  • Hello, I jumped aboard as SEO for a client who seems to have been hit by Panda and Penguin back in April 2012. The Panda part I feel I've fixed by creating better content, combining pages on the same topic into one, and basically creating a better content experience that relates better to the search terms users are searching for. Once the site was redesigned and relaunched, all keywords improved minus one: the main keyword they want to rank for. I created a landing page for it that is very nicely optimized for that keyword and its brothers and sisters; however that page isn't used by Google since it's brand new with a PA of 1. Doing a backlink audit I found 102 links out of 400 using the same anchor text as the keyword they want to rank for; they also have synonym anchor text for other links, but not quite as much. Most of those 102 domains using the main keyword anchor text are directories. In my opinion I'd declare all of them spam, however there are a few with DAs higher than 50, making me a little more nervous to disavow, since I want to make sure we get out of the penalty if we were hit by Penguin, but also don't want to ruin the rankings for the other keywords we're doing better with, since they are long-tails and short, but very relevant to users. What is the best way to determine if a site/directory is spammy enough that it's penalizing you, and how could I approach the anchor text issue with backlinks? 99% of these links I cannot have changed; since they're directories, I doubt many have had a human touch them in a while. Sidenote: if you're going to post a link as a response, try to summarize what that link will be about, as many times links are given as an answer but end up not really providing the meat we were seeking. Thank you!

    | Deacyde
    0

  • Hello Mozers, Our sitemaps were submitted to Google and Bing, and are successfully indexed. Every time pages are added to our store (ecommerce), we re-generate the XML sitemap. My question is: should we be resubmitting the sitemaps every time their content changes, or since they were submitted once, can we assume that the crawlers will re-download the sitemaps by themselves? (I don't like to assume.) What are best practices here? Thanks!

    | yacpro13
    1

  • Hi guys, I'm having a bit of an issue on a client site that I'm hoping someone can help me with. Basically, the client has two domains, one serving users in the Republic of Ireland (http://www.americanholidays.com), showing Euro prices, and the other serving users in Northern Ireland (http://www.americanholidays.com/gb_en/) showing £ prices. The issue I'm having is that the URL for the Northern Ireland page has a 302 on it and goes through another 2/3 301 redirects until it resolves as http://www.americanholidays.com, however it does then show the £ prices. You can see the redirect chain here: http://tools.seobook.com/server-header-checker/?page=single&url=http%3A%2F%2Fwww.americanholidays.com%2Fgb_en%2F&useragent=1&typeProtocol=11 The homepage is using the Hreflang tag, and pointing search engines to serve the http://www.americanholidays.com/gb_en/ page to users using EN-GB as their language. The page is also using a self-referencing canonical, which I believe may negate the whole Hreflang tag anyway? My main question is - is the fact that the Hreflang for the gb_en page is pointing to a chain of redirects negatively affecting it? (I understand too many redirects are never good). Also, is the canonical negating the Hreflang? Any help/info would be great as I just can't get my head around it! Thanks guys Daniel

    | DanielKiely6
    0
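For reference on the question above: hreflang annotations should point at the final, 200-status URL of each variant and be mirrored on both pages; pointing one at a redirect chain risks it being ignored. A sketch using the URLs from the question:

```html
<!-- On both versions of the homepage -->
<link rel="alternate" hreflang="en-ie" href="http://www.americanholidays.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.americanholidays.com/gb_en/" />
```

A self-referencing canonical on each page is compatible with hreflang; it is canonicals pointing across the variants that would undermine it.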

  • Hi all, we just launched a new version of our site and have seen huge drops in traffic to the homepage. Url structures were changed and now include /us at the end and a lot of the content was consolidated to make the homepage a little cleaner. We kept title tags and meta data the same. Rankings have declined significantly. Does anyone have any suggestions or advice?

    | perfectsearch71
    1

  • I have a client who allowed a related business to use a blog post from my client's site and repost it to the related business's site. The problem is the post was copied word for word; there is an introduction and a link back to the website, but not to the post itself. I now manage the related business as well, so I have creative control over both websites as well as SEO duties. What is the best practice for this type of blog post syndication? Can the content appear on both sites?

    | donsilvernail
    0

  • We have a new website on a new URL (it's been up for around 2 years now) and our old website is slowly fading into the background. We are now at the point where the money is still OK, but we are having issues running both side by side. We have a calculator on each page and are thinking about removing it and adding a box saying "please order from our new site here" (with the URL of the similar page). Now, the issue is that we don't want the links to look like they're there for SEO purposes and have Google hammer us (we're thinking of nofollowing them), and we also got a penalty on the old site in 2012, although we did get out of it. Would this cause any issue for the new site?

    | BobAnderson
    1

  • Hi everyone - I've taken over SEO for a site recently. In many cases, the reasons why something was done were not well documented. One of these: on some pages there are lists of selections, and each selection takes the user to a particular page. On the list page, there is often a link from an image, a name, and a couple of others. Each page often has 30 items with 4 links each. For some reason, the 4th of these internal links was nofollowed. When I run this site through several different site evaluation tools, they are all troubled by the number of nofollow links on the site (these instances add up to a 5-figure number). From a user perspective, I totally get why each of these links exists: if I wanted to click on the image or the name or some other attribute, that totally makes sense. It's my understanding that Google/Bing will only consider the 1st instance. If this creates excessive links, wouldn't you want 3 of the 4 links in each set nofollowed? If it's only excessive unique links that really matter, then why would any be nofollowed?

    | APFM
    0

  • Hi Guys, I recently ran into an issue when testing the load speed of my website (https://solvid.co.uk). Occasionally, Google PageSpeed Insights gives me a server connection error which states: "PageSpeed was unable to connect to the server. Ensure that you are using the correct protocol (http vs https), the page loads in a browser, and is accessible on the public internet." GTmetrix gives me an error as well: "An error occurred fetching the page: HTTPS error: SSL connect attempt failed". All of my redirects seem to be set up correctly, as does the SSL certificate. I've contacted my hosting provider (GoDaddy); they say everything is fine with the server and the installation. I've also tried different browsers in incognito mode and still get the same error. Until yesterday I hadn't had such a problem. I would really appreciate your help! Dmytro

    | solvid
    1


