Errors in URLs
-
SEOmoz is showing quite a lot of URL errors like this: http://trampoliny.net.pl/akcesoria/pokrowiec-basic?frontend=1825cb1eea3af8ee6ee2d96617d32ff6
All these URLs use the parameter "?frontend=". In Webmaster Tools we told Google not to index this parameter.
Unfortunately, at the moment we cannot set this parameter to "NOINDEX". We also don't want to use a robots.txt file.
How do we get rid of these URLs in SEOmoz?
-
Hi Henrik,
I'm afraid that Google's crawlers are much more sophisticated than ours, since they have much greater resources than we do, so they may not consider those URLs to be an issue. We are always working to make our crawler better, but unfortunately, at this time, there is no way to exclude certain URLs from crawling in our tool without using the robots.txt file.
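For reference, if you do at some point decide a robots.txt file is acceptable, a rule along these lines (a sketch, assuming our crawler honors the `*` wildcard in Disallow patterns) would keep rogerbot away from the parameterized URLs without affecting other crawlers:

```
User-agent: rogerbot
Disallow: /*?frontend=
```

Because the rule is scoped to the rogerbot user-agent, Google's crawling and your Webmaster Tools parameter settings would be unaffected.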
I'm sorry for any inconvenience that may cause. Please let me know if you have any other questions.
-Chiaryn
Related Questions
-
I am trying to use Page Optimization feature but it is giving me error.
Hi, I am trying to track page optimization for one of my projects, https://www.360degreespropertyinspections.com.au, for the keyword "property inspections melbourne", but I keep getting the error below: "Page Optimization Error There was a problem loading this page. Please make sure the page is loading properly and that our user-agent, rogerbot, is not blocked from accessing this page." I checked the robots.txt file and it all looks fine. Not sure what the problem is. Is it a problem with Moz or with the website?
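One quick way to double-check whether rogerbot is actually blocked is to run the site's robots.txt rules through Python's standard-library robot parser. This is only a sketch; the robots.txt content below is a made-up example, so substitute the site's real file (e.g. fetched from its /robots.txt URL):

```python
from urllib.robotparser import RobotFileParser

# Made-up example rules -- replace with the site's actual robots.txt content
robots_txt = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# rogerbot has no dedicated group here, so it falls under "User-agent: *"
# and only /private/ is blocked
print(parser.can_fetch("rogerbot", "https://example.com/"))           # True
print(parser.can_fetch("rogerbot", "https://example.com/private/x"))  # False
```

If `can_fetch` returns True for the page in question, the block is probably happening somewhere other than robots.txt (a firewall or server rule, for instance).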
On-Page Optimization | | Abhijay191 -
URL shows up in "inurl:" but not when using time parameters
Hey everybody, I have been testing the inurl: feature of Google to try and gauge how long ago Google indexed our page. So, this brings me to my question. If we run inurl:https://mysite.com, all of our domains show up. If we run inurl:https://mysite.com/specialpage, the domain shows up as being indexed. But if I append the "&as_qdr=y15" string to the URL, https://mysite.com/specialpage does not show up. Does anybody have any experience with this? Also, on the same note, when I look at how many pages Google has indexed, it is about half of the pages we see in our backend/sitemap. Any thoughts would be appreciated. TY!
On-Page Optimization | | HashtagHustler1 -
URL keyword separator best practice
Hello. Wanted to reach out and see what the consensus is re keyword separators. I've just taken on a new client and all their URLs are structured like /buybbqpacks rather than /buy-bbq-packs. My understanding is that it comes down to readability, which influences click-through, rather than search impact on the keyword. So we usually advise a hyphen, but the guy's going to have to change a LOT of pages and set up redirects to change it all, and I wasn't sure if it was worth it? Thanks! Stu
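If the change does go ahead, the redirect side can be scripted once a mapping of old to new slugs exists. A rough sketch (the slugs and the .htaccess approach are hypothetical examples, and the mapping has to be hand-curated or pulled from the CMS, since a run-together slug like "buybbqpacks" can't be reliably split by machine):

```python
# Hand-curated map of old, unhyphenated slugs to their hyphenated replacements
redirects = {
    "/buybbqpacks": "/buy-bbq-packs",
    "/bbqaccessories": "/bbq-accessories",
}

def htaccess_rules(mapping):
    """Emit one permanent (301) redirect line per old URL, for Apache .htaccess."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in mapping.items()
    )

print(htaccess_rules(redirects))
```

Generating the rules this way keeps the one-to-one mapping in a single reviewable place rather than hand-editing hundreds of redirect lines.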
On-Page Optimization | | bloomletsgrow0 -
Are prepositions and separate letters in URLs bad for website optimization?
Is it OK for website optimization to use prepositions and separate letters in a URL? Examples: -i-series; -salad-with-avocado, etc.
On-Page Optimization | | adrecom0 -
Changing the url of a page
Hello. I would like to change the URL of a page. It currently has very few inbound links. I would set up a 301 redirect to the new URL. Is there anything else I should take into account before changing the URL? Is there a downside to changing a URL? Do inbound links carry the same value when a 301 redirect is involved? Thank you!
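For reference, a single-page permanent redirect on an Apache server can be as simple as one line in .htaccess (the paths here are hypothetical placeholders):

```apache
# Permanently (301) redirect the old URL path to the new one
Redirect 301 /old-page /new-page
```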
On-Page Optimization | | nyc-seo0 -
Can bad text URLs hurt pages?
If you have some pages that contain plain-text URLs (not anchored links) that used to be good URLs but are now bad, either because the website shut down or because it has been acquired by someone else and is now parked (or worse), are those URLs enough to cause quality problems? For example: This information was brought to you by Waymaker http://www.waymaker.net These aren't the only ones. And yes, I know I should fix them, but there are probably 10,000 pages like it. I will fix them, but it's not something I can do in a few minutes. (This one is easy to fix programmatically, but others are a lot more complex.) So my question is: do you have actual experience that these are bad enough to cause ranking problems (making them low quality)?
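For the "easy to fix programmatically" cases, finding the bare URLs can be a simple regex sweep over the page text. A sketch (the pattern only catches plain http/https URLs, and the sample text below is just the example from the question):

```python
import re

# Matches bare http/https URLs in plain text; the lookbehinds skip URLs
# that already sit inside an href="..." or href='...' attribute
BARE_URL = re.compile(r'(?<!href=")(?<!href=\')https?://[^\s<>"\']+')

sample = "This information was brought to you by Waymaker http://www.waymaker.net"

for match in BARE_URL.finditer(sample):
    print(match.group())  # prints http://www.waymaker.net
```

From there, each hit can be checked or swapped out against a list of known-dead domains before touching the 10,000 pages by hand.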
On-Page Optimization | | loopyal0 -
Can you optimize for 2 keywords per URL?
Or should you just stick to 1 page, 1 keyword all the time? If you do 2, are there any things you should watch out for? Thanks
On-Page Optimization | | inhouseninja0