Hey there
To quote Google on the issue of ASCII vs. UTF-8 encoded characters in URLs, such as Arabic:
"Yes, we can generally keep up with UTF-8 encoded URLs and we’ll generally show them to users in our search results (but link to your server with the URLs properly escaped). I would recommend that you also use escaped URLs in your links, to make sure that your site is compatible with older browsers that don’t understand straight UTF-8 URLs"
So their recommendation would be to have both URLs available (the escaped/English version and the Arabic version) in order to support all users, and the fact that you already do this is a good thing.
The next step would be to make sure you are handling duplicate content correctly. If the Arabic and non-Arabic URLs point to a page with the same content, Google _should_ be able to recognise this as the same page and not penalise you for duplicate content. So if the Arabic URL and the "escaped URL" (the ASCII/English equivalent) both go to the same page, you should be fine. I've seen this work quite a few times with Turkish websites, for example, which also have UTF-8 encoded characters in their URLs.
However, you can reduce the risk further by adding a canonical tag to each page. As far as I am aware, the canonical tag supports Arabic characters, so on each page of the site, add a canonical tag that points to that same page. For example, with the URL above, you would place a canonical tag like the one below.
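Since I don't have your exact URL to hand, here's a rough sketch using a hypothetical Arabic page on example.com - swap in your own domain and path:

```html
<!-- Hypothetical example: placed in the <head> of the Arabic page itself, -->
<!-- so the canonical points back to that same page (a self-referencing canonical). -->
<link rel="canonical" href="http://www.example.com/ar/مثال-صفحة/" />
<!-- The percent-encoded ASCII equivalent of the same URL should also be valid here:
     http://www.example.com/ar/%D9%85%D8%AB%D8%A7%D9%84-%D8%B5%D9%81%D8%AD%D8%A9/ -->
```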
You can read more on canonical tags here: http://moz.com/learn/seo/canonicalization
Do be aware that for XML sitemaps, the URLs in the sitemap need to be URL-escaped - that is to say, UTF-8 encoded URLs need to be percent-encoded into their ASCII equivalent. You can read more about that in this Google guide to using non-alphanumeric characters in Sitemap URLs.
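As a sketch of what that looks like (same hypothetical Arabic URL as above, with the path percent-encoded into its ASCII form for the sitemap entry):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Hypothetical URL: /ar/مثال-صفحة/ percent-encoded to ASCII -->
    <loc>http://www.example.com/ar/%D9%85%D8%AB%D8%A7%D9%84-%D8%B5%D9%81%D8%AD%D8%A9/</loc>
  </url>
</urlset>
```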
Hope this helps.