Moz webfonts do not render correctly ...
-
Actually, I have problems with the rendering of webfonts in all browsers. As Moz uses 100% webfonts, I can hardly read the text.
Anybody had this problem before?
please help!
THX
Attached is a screenshot to show what it looks like...
-
OK, finally I can answer this question myself:
I recently installed a TrueType font, ufonts.com_helvetica_neue_ultralight.ttf,
which affected so many browser fonts that I could hardly use Moz anymore. Uninstalling it solved the problem. Never install the font mentioned above!
-
Hi,
thank you, but as I wrote, the problem is similar in all browsers. I have attached another screenshot showing Chrome, Safari, Firefox, and IE. And not all webfonts are problematic, as the type in "Moz Q&A Forum" renders well.
-
Holger,
You did not say which internet browser you are using. The problem probably stems from the browser's default font being something difficult to read.
In Chrome, click the menu icon in the upper right corner, choose "Settings", scroll to the bottom of the list, and open the advanced settings. Here you can specify the font style and size. Internet Explorer is very similar.
Related Questions
-
We are running into an issue where "Items 30 - 50" appears underneath the article title in Google SERP descriptions. See the screenshot, or you can preview how it appears in the listing for the site here: https://www.google.com/search?source=hp&ei=5I5fX939L6qxytMPh_el4AQ&q=site%3Adarbyscott.com&oq=site%3Adarbyscott.com&gs_lcp=CgZwc3ktYWIQAzoICAAQsQMQgwE6BQgAELEDOgIIADoECAAQCjoHCAAQsQMQClDYAljGJmC9J2gGcAB4AIABgwOIAYwWkgEIMjAuMy4wLjKYAQCgAQGqAQdnd3Mtd2l6sAEA&sclient=psy-ab&ved=0ahUKEwjd_4nR_ejrAhWqmHIEHYd7CUwQ4dUDCAk&uact=5 This count is not accurate, and we are not sure what Google's algorithm is counting: products are 100+, and so are the articles. Anyone have any thoughts on what Google is pulling for the count and how to correct it? We would want to either hide this or correct it.
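One possible way to keep that text out of the snippet (a sketch, assuming the "Items 30 - 50" string comes from pagination markup in the page template; the class name here is hypothetical): Google supports a data-nosnippet attribute that excludes the enclosed text when generating search result snippets.

```html
<!-- Hypothetical template fragment: data-nosnippet tells Google not to
     use the enclosed text when building the search result snippet. -->
<div class="pagination-summary" data-nosnippet>
  Items 30 - 50
</div>
```

This only hides the text from snippets; it does not affect how the page is indexed or ranked.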
Web Design | Raymond-Support
-
Help with 302 Temporary Redirect warning via MOZ crawl
Hi Guys, This is my first post, so hopefully I'm using the forum correctly. The MOZ crawl tells me that I have 35 pages with a temporary redirect. The URL column displays "302 Found" along with the http:// URL, and the Redirect Location column shows the corresponding https:// URL. This all seems pretty self-explanatory. However, I've checked my .htaccess file and I can't see any 302 references in it. I'm trying to figure out where the 302 redirects come from and how I can make them permanent. Please can anyone help me out? My .htaccess looks like it needs a little tidy (there are 2 IfModule blocks):

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.mysite.com/$1 [R,L]
</IfModule>

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
</IfModule>
# END WordPress

Web Design | ianalannash
-
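One likely source of the 302s, judging only from the rules as pasted above: in mod_rewrite, the R flag without an explicit status code defaults to a 302 redirect, and the HTTP-to-HTTPS rule uses bare [R,L]. A sketch of the fix (domain placeholder as in the original) would be:

```apache
# Sketch: bare [R,L] issues a 302 by default; [R=301,L] makes the
# HTTP -> HTTPS redirect permanent.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.mysite.com/$1 [R=301,L]
</IfModule>
```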
What do you use for test rendering your dev site?
I'm redesigning our company ecommerce site and need to test render an infinite scroller to ensure that it is as SEO friendly as possible. My problem is that I cannot view it in Webmaster Tools since I am blocking the site from crawlers using robots.txt. I know I could simply unblock Google temporarily but I really would rather not make my dev site available to search engine crawlers.
Web Design | bearpaw
-
Google text-only vs rendered (index and ranking)
Hello, can someone please help answer a question about missing elements in Google's text-only cached version?

When using JavaScript to display an element that is initially styled with display:none, does Google index (and, most importantly, properly rank) the element's contents? Using Google's "cache:" prefix followed by our page's URL, we can see the rendered cached page. The contents of the element in question are viewable and you can read the information inside. However, if you click the "Text-only version" link in the top-right of Google's cached page, the element is missing and cannot be seen. The reason is that the element is initially styled with display:none, and JavaScript is then used to display the text once some logic is applied.

Doing a long-tail Google search for a few sentences from inside the element does find the page in the results, but I am not certain that it is being cached and ranked optimally. Would updating the logic so that the contents are not hidden until JavaScript reveals them improve our ranking, or can we assume that since Google does return the page in its results everything is proper? Thank you!

Web Design | cpawsgo
-
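For reference, a minimal sketch of the pattern being described (element IDs and the reveal condition are hypothetical, not taken from the actual site): the content is present in the server-rendered HTML but hidden with display:none, and a script later reveals it.

```html
<!-- Hypothetical illustration: the text is already in the initial HTML,
     so it appears in the rendered cache, but a text-only view that
     ignores JavaScript and CSS state may not show it. -->
<div id="details" style="display:none">
  Long-tail content that should be indexed and ranked.
</div>
<script>
  // Some client-side logic decides when to reveal the content;
  // the text itself was never injected by JavaScript.
  if (someCondition) {
    document.getElementById('details').style.display = 'block';
  }
</script>
```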
Were our URLs setup correctly?
The person who built our site set up a lot of the pages like: domain/location/city/title tag. For example: http://www.kempruge.com/location/tampa/tampa-personal-injury-legal-attorneys/ I know the length is too long and it seems entirely unnecessary to me. Many of the pages I have created since I got here are just domain/title tag (which is almost always city-field of law-attorneys-lawyers). However, when I compare the original pages with the new ones, they both rank similarly. Given what a pain it is to change URLs, I'm not sure if it would be worth it to shorten them all or not. However, I would like to know if the way they were set up originally makes sense for some reason I don't understand. Thanks, Ruben

Web Design | KempRugeLawGroup
-
So apparently SEOmoz will get us de-indexed, according to an SEO company!
Each and every day I get called up by an SEO company promising to get me top spots in the Google rankings if I quickly sign up for the special offer they have today. Normally I would say "no thanks" and put the phone down, but I had a bit of spare time, so I indulged the guy and we got talking. After the introductions and the spiel about his company, he showed me what his company does and how they go about getting me top ranks (they don't actually get me ranks, but create a website they own which then passes leads to me; kinda clever, since they could then start charging me per lead, or charge my competitors). We continued to talk, and I mentioned I used SEOmoz to check my rankings and backlinks etc., and he told me that Google is cracking down and anyone using these types of software/websites will get their websites de-indexed. This struck me as BS, but I wanted to get your thoughts on the matter. I personally don't believe Google would ever do such a thing, since it would be so easy to get your competitors' websites taken down (i.e. negative SEO), but it's certainly a talking point.

Web Design | GarethEJones
-
Duplicate Page Content mysite.com and mysite.com/index.html MOZ Dashboard
According to my MOZ dashboard, my site shows duplicate page content between mysite.com and mysite.com/index.html. What can I do about that? I want to redirect mysite.com/index.html to mysite.com; how can I do that using the .htaccess file?
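One common way to handle this (a sketch, assuming an Apache server with mod_rewrite enabled; substitute the real domain for mysite.com) is to 301-redirect direct requests for /index.html to the root URL:

```apache
# Hypothetical .htaccess sketch: 301-redirect /index.html to the root.
# %{THE_REQUEST} matches only the original client request line, so
# internal rewrites to index.html do not trigger a redirect loop.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html[\s?]
RewriteRule ^index\.html$ http://www.mysite.com/ [R=301,L]
</IfModule>
```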
Web Design | innofidelity
-
Off Screen Rendering & Other Tactics
Hi Folks, We're currently trying to change our website search results to render in HTML in the first instance, then switch over to AJAX when a user clicks on filters. We came across an issue that diminishes the user experience, so we used the method below:

We have moved the search grid offscreen in the initial render because we execute a lot of JavaScript that modifies the DOM within the grid. Also, when a user performs a search from within the page, the hash is updated to identify the new search terms. Because the hash is not sent to the server, a user who has done a search and then refreshes would initially see incorrect search results, which the correct results would then replace. For example, on an initial search a user reaches a URL akin to search.veer.com/chicken. When they perform a search from that page, the hash gets updated to search.veer.com/chicken#keyword=monkey. If the user refreshes the page, the server only receives the request for chicken and serves up the page with those results rendered on it. The JavaScript then checks the hash, determines that it needs to run a different search, and fires off an AJAX call to get the new results. If we did not render the results offscreen, the user would (confusingly) see the results for chicken and be able to briefly interact with them until the AJAX call returns and the results are replaced with the correct monkey results. By rendering offscreen, the initial results are not visible, and the JavaScript can move them onscreen immediately if there is no hash, or wait until the AJAX call returns, rebuild the grid, and then move it onscreen.

Now, I know that rendering text with a -9999 indent is a black-hat SEO tactic. But would it be treated the same in this case? We're only doing this to avoid bad UI. Please advise. We also came across these two articles that may offer alternative options. The articles state that each tactic is SEO-friendly, but I'd like to run it by the community and see if you guys agree:
http://joshblog.net/2007/08/03/make-your-rich-internet-application-seo-friendly/
http://www.inqbation.com/tools-to-increase-accessibility-in-the-web/
Thank you for your help!

Web Design | CorbisVeer
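The refresh logic described above can be sketched as a small pure function (function names and the hash format are taken from the example URLs; everything else is a hypothetical illustration, not code from the actual site): given the keyword the server rendered and the current location hash, decide whether the pre-rendered offscreen grid can be shown immediately or a new AJAX search must run first.

```javascript
// Extract the keyword from a hash like "#keyword=monkey"; null if absent.
function keywordFromHash(hash) {
  const match = /(?:^|[#&])keyword=([^&]*)/.exec(hash || "");
  return match ? decodeURIComponent(match[1]) : null;
}

// Decide what to do with the server-rendered (offscreen) grid on page load.
function resolveSearch(serverKeyword, hash) {
  const hashKeyword = keywordFromHash(hash);
  if (hashKeyword === null || hashKeyword === serverKeyword) {
    // No hash, or the same term: move the offscreen grid onscreen now.
    return { action: "show-prerendered", keyword: serverKeyword };
  }
  // The hash names a different term: keep the grid offscreen and
  // fetch the correct results via AJAX before showing anything.
  return { action: "ajax-search", keyword: hashKeyword };
}

console.log(resolveSearch("chicken", ""));                 // show pre-rendered grid
console.log(resolveSearch("chicken", "#keyword=monkey"));  // AJAX search for "monkey"
```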