What is URL Enforce Writer & How Can It Be Used?
-
Hi, what is a URL enforce writer for rewriting existing web page URLs? The website's pages currently have underscores in them; I would like to use hyphens (-) between the words instead.
Here is the URL: http://www.cleanitsupply.com/t-Janitorial_Supplies_New_York_City.aspx
Please suggest how I can use a URL enforce writer to rewrite the URLs without a 301.
Your quick answers will be appreciated.
Note: This page has external backlinks.
Thanks
-
Hi Keri, I learned about URL enforce writer; it's nothing but redirecting the pages, not at the code level but on the server side.
-
Hi! I can't say that I've heard of the "URL Enforce Writer" before, and a quick search in Google shows your question as the first result. Is this something that is an option in a content management system perhaps? And why do you not want to use a 301 redirect?
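For what it's worth, server-side rewriting of underscore URLs to hyphen URLs is normally done with a rewrite rule, and the standard advice is to pair it with a 301 precisely because the page has external backlinks (the 301 passes them to the new URL). A minimal sketch, assuming an Apache host with mod_rewrite enabled (the .aspx extension suggests IIS, where the equivalent would live in web.config's rewrite section instead):

```apache
# .htaccess sketch: redirect any URL containing an underscore to the
# same URL with that underscore replaced by a hyphen. The greedy match
# replaces the last underscore per hop, so URLs with several underscores
# resolve over a short chain of 301s.
RewriteEngine On
RewriteRule ^(.*)_(.*)$ /$1-$2 [R=301,L]
```

Serving the hyphen URL with no redirect at all would leave two live copies of the same page; if that route is taken anyway, a canonical tag pointing at the preferred version is needed to avoid splitting the backlink equity.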
Related Questions
-
Marketing URL
Hi, I need a bit of advice on marketing URLs. The destination URL is http://www.website.com/by-development.php?area=Isle Of Wight&development=developmentname. If we wanted to use www.website.com/developmentname on literature to send people to the ugly URL above, what would we do? Would we need to rewrite the ugly URL to the neat one and then 301 the ugly to the neat? Currently, the team are using a new domain of neatandrelevant.info and 301 redirecting it to the ugly URL, but there are lots of different developments they want to send people to, so a new domain is bought for each development, which seems a bit unnecessary. They point to different pages on the ugly-URL website. I'm assuming a canonical tag would not be needed then, because the ugly URL page would be redirected. Also, as the website has ugly URLs anyway, would it not be best practice to use rewrites anyway so that the URLs read www.mywebsite.com/region/development? Would it confuse things to then have extra short marketing URLs missing out /region? Hope that makes sense....
Technical SEO | Houses
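One way to get both the neat /region/development style and a single hop from the ugly URL, sketched for Apache (the rule patterns are illustrative, the parameter names come from the question, and area values like "Isle Of Wight" would need URL-safe slugs first):

```apache
RewriteEngine On

# 1) Old ugly URL -> neat URL, as a single 301.
#    %{THE_REQUEST} only matches the original browser request line,
#    so the internal rewrite below cannot re-trigger this rule (no loop).
RewriteCond %{THE_REQUEST} \?area=([^&\s]+)&development=([^&\s]+)
RewriteRule ^by-development\.php$ /%1/%2? [R=301,L]

# 2) Neat /region/development URL -> the real script, invisibly.
RewriteRule ^([A-Za-z-]+)/([A-Za-z-]+)$ /by-development.php?area=$1&development=$2 [L]
```

With this in place the assumption in the question holds: no canonical tag is needed on the ugly URL, because it 301s rather than serving content, and no extra domains have to be bought per development.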
SEOmoz Cannot Crawl My Site
Hello there. SEOmoz cannot crawl my site. It's been 3 days now and not a single page has been crawled. I deleted the campaign and tried again; still no crawl, not a single page. Any solutions?
Technical SEO | ExpertSolutions
G+ and Authorship & Publisher
Hi, I've got one client for whom I have connected their G+ personal page to their site via the email process of setting up authorship. I also set up their company page on G+ and want to link it to the site too, but it's saying the site is already verified/linked. I know I haven't added any rel=publisher code to the site, so I don't know how this can be, unless of course it's using the already established author details (since they're admin for the company page) to make the company page connection. Is it the case that you now don't need to add the rel=publisher code to establish the publisher/verification link with your website? Similar to no longer needing to add rel=author markup to the site to establish authorship (since that can now be established via email)? Any clarity here is appreciated. Cheers, Dan
Technical SEO | Dan-Lawrence
I have altered a URL as it was too long. Do I need to do a 301 redirect for the old URL?
Crawl diagnostics has shown a URL that is too long on one of our sites. I have altered it to make it shorter. Do I now need to do a 301 redirect from the old URL? I have altered a URL previously and the old URL now goes to the home page; I can't understand why. Does anyone know what best practice is here? Thanks
Technical SEO | kingwheelie
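Yes: if the long URL has been live (indexed or linked), the usual practice is a 301 from old to new so link equity and bookmarks follow. A one-line sketch for an Apache .htaccess file, with hypothetical paths:

```apache
# Old long URL -> new shorter URL (both paths are illustrative)
Redirect 301 /category/my-very-long-old-page-name-that-was-too-long /category/short-name
```

If the previously altered URL now lands on the home page, something is probably already redirecting it there (a catch-all redirect of 404s to the home page is a common culprit) and would be worth checking before adding a rule like the above.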
Subdomains & CDNs
I've set up a CDN to speed up my domain. I've set up a CNAME to map the subdomain cdn.example.com to the URL where the CDN hosts my static content (images, CSS and JS files, and PDFs). www.example.com and cdn.example.com are now two different IP addresses. Internal links to my PDF files (white papers and articles) used to be www.example.com/downloads but now they are cdn.example.com/downloads The same PDF files can be accessed at both the www and the cdn. subdomain. Thus, external links to the www version will continue to work. Question 1: Should I set up 301 redirects in .htaccess such as: Redirect permanent /downloads/filename.pdf http://cdn.example.com/downloads/filename.pdf Question 2: Do I need to do anything else in my .htaccess file (or anywhere else) to ensure that any SEO benefit provided by the PDF files remains associated with my domain? Question 3: Am I better off keeping my PDF files on the www side and off of the CDN? Thanks, Akira
Technical SEO | ahirai
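On Question 1, a pattern-based redirect covers every PDF in one rule rather than one Redirect line per file. A sketch using the example.com names from the question, assuming Apache with mod_alias:

```apache
# Send all old www download URLs to the CDN copies in a single rule
RedirectMatch 301 ^/downloads/(.*\.pdf)$ http://cdn.example.com/downloads/$1
```

This keeps old external links working with a single hop; whether the PDFs are better left on the www host at all (Question 3) is a separate trade-off that the redirect itself does not change.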
JavaScript -- can search engines crawl it?
I have a couple of nested divs. I'd like to do an onclick="location.href='http://www.example.com';" within the outermost div so that all content within it will link to one URL. Can the search engines crawl this? Thanks!
Technical SEO | Morris77
Correct Way to Write Meta Tags
OK, so this is a really, really basic question. However, I'm seeing some meta tags written differently to normal and I'm wondering if a) this is correct and b) whether there is any benefit. Normally it's like this: <meta name="keywords" content="..."> However, I am seeing it written like this in some places: <meta content="..." name="keywords"> So, the content= and name= are swapped around. I assume the people that did this were thinking that bringing the content forward would mean that Google reads keywords first. Just wondering if anybody knows whether this is good practice or not? It just spiked my interest, so apologies for the basic nature of the question!
Technical SEO | RiceMedia
AJAX #! URLs, Linking & Meta Refresh
Hi, We recently underwent a platform change and unfortunately our updated ecommerce site was coded using JavaScript. The top navigation is uncrawlable, the pertinent product copy is undetectable and duplicated throughout the code, etc.; it needs a lot of work to make it (even somewhat) SEO-friendly. We're in the process of implementing AJAX #! URLs on our site and I've been tasked with creating a document of items that I will test to see if this solution will help our rankings, indexing, etc. (on Google; I've read about the issues with Bing). I have 2 questions: 1. Do I need to notify our content team, who works on our linking strategy, about the new URLs? Would we use the #! URL (for SEO) or would we continue to use the clean URL (without the #!) for inbound links? 2. When our site transferred over, we used a meta refresh on all of the pages instead of 301s for some reason. Instead of going to a clean URL, our meta refresh points to a browsererrorview page. Would I update it to have the #! in the URL? Should I try to clean up the meta refresh so it goes to an actual www URL and not this browsererrorview page? Or just push for the 301? I have read a ton of articles, including GWT docs, but I can't seem to find any solid information on these specific questions, so any help I can get would be greatly appreciated. Thanks!
Technical SEO | Improvements
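For background on how #! URLs were made crawlable: under Google's (now-deprecated) AJAX crawling scheme, the crawler requested the page with an _escaped_fragment_ query parameter and the server answered with a prerendered HTML snapshot. A sketch of that server-side mapping, assuming Apache and an illustrative /snapshots/ directory:

```apache
# Serve a prerendered HTML snapshot when a crawler requests
# ?_escaped_fragment_=<state> in place of the #! fragment
RewriteEngine On
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$
RewriteRule ^ /snapshots/%1.html? [L]
```

Note that inbound links keep using the public #! form (question 1); the fragment never reaches the server in normal requests, and the _escaped_fragment_ URL is only what the crawler asks for behind the scenes.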