File Names
-
Hi guys,
Would it make a difference if I named a URL
2014-ford-fiesta.html
or
2014+ford+fiesta.html
Thanks!
-
A + in a URL simply represents a space, and dashes are also treated as word separators, so there would be no impact either way. You can find thousands of sites doing well in the SERPs using +'s.
That being said, if I were building it from scratch I'd use dashes. They're just easier for users to read, and there is a small possibility that a + could trip up the search engines. It's an unrealistically small chance, but I put my tin foil hat on the same way many SEOs do.
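The +-as-space convention comes from form (query-string) encoding rather than path encoding, which is easy to see with Python's standard library; a quick sketch using `urllib.parse`:

```python
from urllib.parse import quote, quote_plus, unquote, unquote_plus

# Standard percent-encoding (used for URL paths): a space becomes %20.
print(quote("2014 ford fiesta"))         # 2014%20ford%20fiesta

# Form/query-string encoding: a space becomes +.
print(quote_plus("2014 ford fiesta"))    # 2014+ford+fiesta

# Decoding with the form rules turns + back into a space...
print(unquote_plus("2014+ford+fiesta"))  # 2014 ford fiesta

# ...while plain unquote leaves a literal + alone.
print(unquote("2014+ford+fiesta"))       # 2014+ford+fiesta
```

In other words, in the path portion of a URL a literal + is just another character, which is one more reason dashes are the safer default.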
Related Questions
-
My website is my name. Overnight it went from being the number one Google result to not showing up at all when you Google my name. Why would this happen?
I built my website via Squarespace. It is my name. If you Googled my name, it was the number one hit. Suddenly, two weeks ago, it stopped showing up AT ALL. I went through Squarespace's SEO checklist, secured my site, etc. It still doesn't show up. Why would this happen all of a sudden, and what can I do? Thank you!
Intermediate & Advanced SEO | Jbark
How does LinkedIn get grey microdata when searching a person's name?
If you Google any person's name who has a LinkedIn profile and then locate that entry in the search results (LinkedIn profiles are usually on the first page for most people), you will see that they get microdata indexed, which is basically the person's location and headline from their profile. Looking at their markup, I see the location, which makes sense as it is in hCard format, but I do not see any microformat data around the headline. Any ideas how they get this?
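For reference, hCard marks up a person's details with conventional class names. A minimal sketch of such markup follows; the class names come from the microformats hCard spec, and this is an illustration only, not LinkedIn's actual template:

```html
<!-- Minimal hCard sketch; names and values are hypothetical. -->
<div class="vcard">
  <span class="fn">Jane Doe</span>
  <span class="title">Marketing Director</span>
  <span class="adr">
    <span class="locality">Seattle</span>,
    <span class="region">WA</span>
  </span>
</div>
```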
Intermediate & Advanced SEO | stacks21
Bing Disavow file
Hi, I have just set up Bing Webmaster Tools and wanted to submit my disavow file. However, I can only work out how to add one link at a time; does anyone know how to add a CSV file? Thanks in advance. Andy
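For comparison, Google's disavow tool accepts a plain text file with one URL or `domain:` rule per line; a minimal example (the domains are placeholders) looks like:

```text
# Lines beginning with # are comments.
domain:spammy-directory.example
http://link-farm.example/some-page.html
```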
Intermediate & Advanced SEO | Andy-Halliday
Please share best practices for subfolders and paths in a domain name
I am seeking feedback on the best way to proceed with a project I am working on. Say, for example, the domain was domain.com and the site wanted to target specific markets such as realtors, attorneys, churches, and restaurants. Which URL structure would be better: domain.com/industries/attorneys or domain.com/attorneys? Can I get your feedback, along with any supporting articles? This is for a large ecommerce site, but these particular pages are solely going to be used for marketing purposes, to bring those visitors to the website and let them know we understand their needs. Thanks for your help. Malcom
Intermediate & Advanced SEO | PrintPlace.com
How to leverage browser caching for a specific file
Hello all,
I am trying to figure out how to add leverage browser caching to these items:
http://maps.googleapis.com/maps/api/js?v=3.exp&sensor=false&language=en
http://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js
http://www.google-analytics.com/analytics.js
What's hard is that I understand the purpose, but unlike a CSS file, how do you specify an expiration on an actual direct-path file? Any help or a link to get help is appreciated. Chris
Intermediate & Advanced SEO | asbchris
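One relevant detail: cache lifetimes are set by the server that hosts the file, so headers on scripts served from Google's domains cannot be changed from your own server; expiry rules only apply to files you serve yourself. As a sketch, assuming an Apache server with mod_expires enabled, a self-hosted copy of a script could be given a long cache lifetime like this:

```apache
# Applies only to files served from your own domain; headers on
# maps.googleapis.com / google-analytics.com are set by Google.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```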
Files blocked in robots.txt and SEO
I use Joomla and I have blocked the following in my robots.txt. Is there anything here that is bad for SEO?
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /images/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Disallow: /mailto:myemail@myemail.com/
Disallow: /javascript:void(0)
Disallow: /.pdf
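When auditing a rule set like this, it can help to sanity-check exactly which paths it blocks. A small sketch using Python's standard-library robots parser (example.com and the sample URLs are placeholders, and only a subset of the rules is shown):

```python
from urllib.robotparser import RobotFileParser

# A subset of the rules from the question above.
rules = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /images/
Disallow: /templates/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Anything under a disallowed prefix is blocked for all user agents...
print(rp.can_fetch("*", "https://example.com/images/logo.png"))   # False

# ...while ordinary content pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/my-article.html"))   # True
```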
Intermediate & Advanced SEO | seoanalytics
Could you use a robots.txt file to disallow a duplicate-content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple of spots in the site navigation. The site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Do you think this is a workable/acceptable solution?
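For concreteness, a rule along the lines described would look like the fragment below; the path is entirely hypothetical:

```text
User-agent: *
Disallow: /alternate-nav-path/widget-overview/
```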
Intermediate & Advanced SEO | gregelwell
Sitemaps. When compressed do you use the .gz file format or the (untidy looking, IMHO) .xml.gz format?
When submitting compressed sitemaps to Google, I normally use a file named sitemap.gz. A customer is banging on that his web guy says sitemap.xml.gz is a better format. Google spiders sitemap.gz just fine, and in Webmaster Tools everything looks OK... Interested to know other SEOmoz Pros' preferences here, and also to check I haven't made an error that is going to bite me in the ass soon! Over to you.
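Worth noting: the two names describe identical bytes; gzip does not care what the file is called, and the .xml.gz form merely makes the inner format explicit. A quick Python sketch (example.com is a placeholder) showing the file round-trips either way:

```python
import gzip

# A tiny sitemap document.
xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    '  <url><loc>https://example.com/</loc></url>\n'
    '</urlset>\n'
)

# Whether named sitemap.gz or sitemap.xml.gz, the compressed
# contents are the same.
with gzip.open("sitemap.xml.gz", "wt", encoding="utf-8") as f:
    f.write(xml)

# Decompression recovers the original XML.
with gzip.open("sitemap.xml.gz", "rt", encoding="utf-8") as f:
    roundtrip = f.read()

print(roundtrip == xml)  # True
```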
Intermediate & Advanced SEO | NoisyLittleMonkey