Moz webfonts do not render correctly ...
-
Actually, I have problems with the rendering of webfonts in all browsers. Since Moz uses webfonts throughout, I can hardly read the text.
Has anybody had this problem before?
Please help!
Thanks.
I've attached a screenshot to show what it looks like...
-
OK, finally I can answer this question myself:
I recently installed a TrueType font, ufonts.com_helvetica_neue_ultralight.ttf,
which overrode so many browser fonts that I could hardly use Moz anymore. So the problem is solved. Never install the font mentioned above!
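For anyone else chasing the same symptom on their own site, here is one way a locally installed font can shadow a site's webfonts. This is a hypothetical @font-face rule for illustration, not Moz's actual CSS: when local() is listed first in the src list, the browser prefers a copy already installed on the visitor's machine, so a broken local font wins over a perfectly good hosted file.

```css
/* Hypothetical example -- not Moz's stylesheet. */
@font-face {
  font-family: "Helvetica Neue Web";
  src: local("HelveticaNeue-UltraLight"),               /* a broken installed copy wins here */
       url("/fonts/helveticaneue-ultralight.woff2") format("woff2"),
       url("/fonts/helveticaneue-ultralight.woff") format("woff");
  font-weight: 200;
}

/* Dropping the local() source forces every visitor onto the hosted file:
   src: url("/fonts/helveticaneue-ultralight.woff2") format("woff2"); */
```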
-
Hi,
thank you. As I wrote, the problem is similar in all browsers. I've attached another screenshot showing Chrome, Safari, FF and IE. And not all webfonts are problematic: the type used for "Moz Q&A Forum", for example, renders fine.
-
Holger,
You did not say which internet browser you are using. The problem probably stems from the browser's default font being something difficult to read.
In Chrome, click the menu icon in the upper right corner, choose "Settings", then scroll to the bottom of the list and open the advanced settings. That is where you can specify the font style and size. Internet Explorer is very similar.
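As a side note for site owners: the browser default font discussed above only kicks in once the page's own font stack is exhausted, so a defensive stack keeps text readable even when a webfont misbehaves. A minimal, generic sketch (the family names are placeholders, not Moz's real stack):

```css
/* Illustrative only. The browser's configurable default font is used only
   if none of these families resolve, so ending the stack with a generic
   keyword keeps body text legible even if the webfont fails to load. */
body {
  font-family: "SomeWebFont", "Helvetica Neue", Helvetica, Arial, sans-serif;
  font-size: 16px;
  line-height: 1.5;
}
```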
Related Questions
-
Is there any proof that Google can crawl PWAs correctly yet?
At the end of 2018 we rolled out our agency website as a PWA. At the time, Google used headless Chrome (41) to render our website. Although all sources announced at the time that it 'should work', we experienced the opposite. As a solution we implemented server-side rendering as a fallback, so that we did not experience any negative effects. We are now more than a year later. Does anyone have 'evidence' that Google can actually render and correctly interpret client-side PWAs? (A sketch of that kind of fallback follows below.)
Web Design | | Erwin000 -
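A minimal sketch of the fallback approach mentioned in the question above, often called dynamic rendering: known crawlers get pre-rendered HTML while regular visitors get the client-side PWA. Everything here is illustrative (an assumed Express setup, a made-up bot list, and a stub renderPageToHtml standing in for a real SSR build or pre-render cache).

```js
// Illustrative sketch only -- not the poster's actual setup.
const express = require("express");
const app = express();

const BOT_PATTERN = /googlebot|bingbot|yandexbot|duckduckbot/i;

// Stand-in for a real SSR build or pre-render cache lookup.
function renderPageToHtml(path) {
  return `<html><body><h1>Pre-rendered content for ${path}</h1></body></html>`;
}

app.use((req, res, next) => {
  const userAgent = req.headers["user-agent"] || "";
  if (BOT_PATTERN.test(userAgent)) {
    return res.send(renderPageToHtml(req.path)); // crawlers get static HTML
  }
  next(); // regular visitors fall through to the PWA assets
});

app.use(express.static("dist")); // the built client-side PWA
app.listen(3000);
```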
What do you use for test rendering your dev site?
I'm redesigning our company ecommerce site and need to test-render an infinite scroller to ensure that it is as SEO-friendly as possible. My problem is that I cannot view it in Webmaster Tools, since I am blocking the site from crawlers using robots.txt. I know I could simply unblock Google temporarily, but I really would rather not make my dev site available to search engine crawlers.
Web Design | | bearpaw0 -
Moz crawl showing ?s=keyword pages as errors
Hi all, Hoping someone can shed some light on a fix relating to WordPress and the search function it uses, as Moz is crawling some pages which reference the search URL domain.com/?s=keyword. The errors showing up are duplicate pages, descriptions and titles. The search function is not important on this site and I have tried a plugin which disables the search page, which it does, but these errors still show up. Can anyone assist, as this is the final piece of the puzzle and then we're down to 0 issues on the site? (A sketch of one possible fix follows below.)
Web Design | | wtfi0 -
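One common approach, sketched below, is to mark internal search-result pages as noindex from the theme rather than relying on a plugin. This uses standard WordPress hooks (wp_head, is_search), but treat it as an illustration and test it on a staging copy first.

```php
<?php
// Illustrative snippet for a theme's functions.php.
// Marks search-result URLs (/?s=keyword) as noindex so crawlers stop
// reporting them for duplicate titles and descriptions.
add_action( 'wp_head', function () {
    if ( is_search() ) {
        echo '<meta name="robots" content="noindex, nofollow">' . "\n";
    }
} );
```

Pairing this with a `Disallow: /*?s=` line in robots.txt (or removing the search form from the theme entirely) should also keep those URLs from being crawled in the first place.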
Were our URLs set up correctly?
The person who built our site set up a lot of the pages like: domain/location/city/title tag For example: http://www.kempruge.com/location/tampa/tampa-personal-injury-legal-attorneys/ I know the length is too long and it seems entirely unnecessary to me. Many of the pages I have created since I got here are just domain/title tag (which is almost always city-field of law-attorneys-lawyers). However, when I compare the original pages with the new ones, they both rank similarly. Given what a pain it is to change URLs (a redirect sketch follows below), I'm not sure if it would be worth it to shorten them all or not. However, I would like to know if the way they were set up originally makes sense for some reason I don't understand. Thanks, Ruben
Web Design | | KempRugeLawGroup1 -
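If you did decide to shorten them, the usual way to preserve the old URLs is a 301 redirect. A hypothetical .htaccess rule (Apache mod_alias) that mirrors the URL pattern in the question, not the firm's actual configuration:

```apache
# Hypothetical: 301-redirect /location/<city>/<slug>/ to /<slug>/
# so existing links and bookmarks keep working.
RedirectMatch 301 ^/location/[^/]+/([^/]+)/?$ /$1/
```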
We believe we accomplished an SEO Parallax site with a nice balance. Can the MOZ community critique this site from an SEO perspective?
Our goal was to build a site that has parallax scrolling and great onsite optimization. We noticed that most award-winning sites on www.awwwards.com have great parallax scrolling but no SEO. Can the Moz community critique this site from an SEO perspective? (Note: this site was optimized for Chrome or Firefox. If you are using IE, you will be redirected to the old site.) www.posicionamientowebenbuscadores.com Note the site is still in beta. It uses the following technologies: CSS3, HTML5, responsive design, WordPress, parallax scrolling, onsite optimization (SEO). No mobile (ran out of funds...)
Web Design | | Carla_Dawson0 -
Hey, on some of my report cards it's saying I'm not using rel canonical correctly. How do I change this on my site?
On some of my report cards it's saying certain things featured on my services page are actually linking to my blog or something, and it's saying I'm not using rel canonical correctly. Can you help me out? (A generic example of a canonical tag follows below.)
Web Design | | ClearVisionDesign0 -
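For reference, a canonical tag is a single line in the page's <head> that points at the preferred URL for that content. A generic, hypothetical example with a placeholder domain:

```html
<!-- In the <head> of the services page; the href should normally be the
     page's own preferred URL, not the blog. Placeholder URL shown. -->
<link rel="canonical" href="https://www.example.com/services/" />
```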
Duplicate Page Content: mysite.com and mysite.com/index.html (Moz Dashboard)
According to the Moz dashboard, my site shows duplicate page content for mysite.com and mysite.com/index.html. What can I do about that? I want to redirect mysite.com/index.html to mysite.com; how can I do that using the .htaccess file? (A sketch follows below.)
Web Design | | innofidelity0 -
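A commonly used rule for this, assuming Apache with mod_rewrite enabled (test it on a staging copy first), 301-redirects any request for index.html to the bare directory URL:

```apache
RewriteEngine On
# Redirect /index.html (and /sub/index.html) to the trailing-slash URL.
# The THE_REQUEST condition avoids a loop when Apache serves index.html
# internally via DirectoryIndex.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /(.*/)?index\.html[?\ ]
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]
```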
Off Screen Rendering & Other Tactics
Hi Folks, We're currently trying to change our website search results to render in HTML in the first instance and then switch over to AJAX when the user clicks on filters. But we came across an issue that diminishes the user experience, so we used the method below: We have moved the search grid offscreen in the initial rendering because we execute a lot of Javascript that modifies the DOM within the grid. Also, when a user has performed a search from within the page, the hash is updated to identify the new search terms. Because this is not sent to the server, a user who has done a search and refreshes would initially see incorrect search results, and the correct search results would then replace them. For example, on an initial search a user reaches a URL akin to search.veer.com/chicken. When they perform a search from that page, the hash gets updated to search.veer.com/chicken#keyword=monkey. If the user refreshes the page, the server only receives the request for chicken and then serves up the page with those results rendered on it. The Javascript then checks the hash, determines that it needs to run a different search, and fires off an AJAX call to get the new results. If we did not render the results offscreen, the user would see the results for chicken (confusingly) and be able to briefly interact with them until the AJAX call returns and the results are replaced with the correct monkey results. By rendering offscreen, the initial results are not visible and the Javascript can move them immediately onscreen if there is no hash, or wait until the AJAX call returns and then rebuild the grid and move it onscreen. (A stripped-down sketch of this flow follows below.) Now I know that setting a text-indent of -9999 is a black-hat SEO tactic. But would it be the same in this case? We're only doing this to avoid bad UI. Please advise. Also, we came across these two articles that may serve as alternative options. These articles state that each tactic is SEO-friendly, but I'd like to run it by the community and see if you guys agree. http://joshblog.net/2007/08/03/make-your-rich-internet-application-seo-friendly/ http://www.inqbation.com/tools-to-increase-accessibility-in-the-web/ Thank you for your help!
Web Design | | CorbisVeer0
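A stripped-down sketch of the flow described in the question above; element IDs, the endpoint and the buildGridHtml helper are all placeholders, not Veer's actual code. On load the script checks location.hash: with no hash the server-rendered grid is shown immediately, otherwise it stays off-screen until the AJAX results replace it.

```js
// Illustrative only -- names and URLs are made up.
document.addEventListener("DOMContentLoaded", function () {
  var grid = document.getElementById("search-grid"); // rendered off-screen by the server
  var hash = window.location.hash;                   // e.g. "#keyword=monkey"

  if (!hash) {
    grid.classList.remove("offscreen"); // server-rendered results are correct: show them
    return;
  }

  var keyword = decodeURIComponent(hash.replace("#keyword=", ""));
  fetch("/search.json?keyword=" + encodeURIComponent(keyword))
    .then(function (res) { return res.json(); })
    .then(function (results) {
      grid.innerHTML = buildGridHtml(results); // placeholder templating helper
      grid.classList.remove("offscreen");      // only now move the grid on-screen
    });
});

// Placeholder standing in for whatever templating the site actually uses.
function buildGridHtml(results) {
  return results
    .map(function (r) { return '<div class="item">' + r.title + "</div>"; })
    .join("");
}
```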