An important lesson learned about AJAX and accessibility
Yesterday I went to visit some fellow consultants at their assignment at a subsidiary/department of one of Sweden’s largest banks. We talked about AJAX in general and different ways to implement it, and one of them opened his web browser to navigate to some AJAX-based web sites.
What followed really baffled me. Most web sites he went to had empty white patches where no content showed up, and some web pages even went completely blank. We knew for sure that JavaScript was enabled in his web browser of choice (IE, but still almost a real web browser… ;-)), so that couldn’t be the problem.
Then, naturally, we had to test my ASK script to see what was going on. The version we got there was the fallback version that works without JavaScript, using regular links that reload the entire web page instead, meaning that no JavaScript events had been applied.
After some digging, we found out that the JavaScript file was completely blank! The reason, apparently, is that the proxy server they had to go through to access the Internet completely cleansed any JavaScript file that contained this text:
new ActiveXObject
So much for object detection and every other approach we recommend to web developers. Not a single line of code was left behind in the file. And the problem is that the browser won’t throw an error or show the content of a <code>noscript</code> tag either; everything just stops working.
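For context, the kind of object detection in question usually looks roughly like the sketch below (a generic example, not the exact ASK code); just having the <code>new ActiveXObject</code> string anywhere in the file was enough for the proxy to empty the whole thing:

<code>
// A generic sketch of cross-browser object detection for AJAX.
// A text-matching proxy that spots "new ActiveXObject" throws away
// the entire file, so this branching never even gets a chance to run.
function createRequest() {
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();
    }
    else if (window.ActiveXObject) {
        return new ActiveXObject("Microsoft.XMLHTTP");
    }
    return null;
}
</code>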
My initial reaction was that if they run such a tight security environment, I don’t really care to cater to them. But as my blood stopped boiling (kind of an exaggeration), I realized that this company was too big to ignore the fact that all of its users were being shut out.
Also, if they have a situation like this, it’s likely that many other large companies have a similar solution.
Conclusion: if you want to develop AJAX apps, make sure they work without JavaScript as well, and apply all scripts in an unobtrusive fashion. I’m just glad that ASK passed the test, with its accessible groundwork and the AJAX functionality built on top of that. (Actually, the Google Analytics code on the ASK page did throw an error when we tested it, but I think that was just a consequence of the proxy server doing its job…)
Hmm… this is an interesting issue indeed. It introduces a lot of problems for pure AJAX applications that for whatever reason don't offer a non-JS version. A good example is a BackBase-powered pure AJAX web application.
I've seen those break horribly because of filtering proxies as well. Like you said, the noscript stuff won't work either because JS is actually enabled on the client.
I guess it may be worthwhile to look for a way to detect whether a filtering proxy is present, possibly with a piece of JS that won't get filtered out and that detects the absence of certain other code…
Just thinking out loud here… 🙂
But what can you actually do against such problems? I mean, you can't foresee how a client machine or network will filter your JavaScript. Therefore you'll have to wait and react when an error occurs.
Or did I just misunderstand?
Marco,
My thoughts have been similar to yours…
To have some kind of variable that's initially false, but that gets set to true in the JavaScript file that contains the AJAX functionality.
I guess it's good practice then to separate the <code>XMLHttpRequest</code> code into its own file, so as not to ruin any other scripts in the web page.
Chris,
No, you understood me correctly, and it's a valid question.
My gut feeling is the approach mentioned above: to separate the AJAX functionality into its own file, and maybe also to set variables and check that they exist, to make sure everything will work as intended.
Also, when possible, to have a proper fallback if the necessary JavaScript support isn't there.
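A minimal sketch of that idea (the file name <code>ajax.js</code> and the flag name are just examples here): an inline flag starts out false and is only flipped to true at the very end of the external AJAX file, so the page can tell whether that file actually arrived intact:

<code>
// Inline in the web page, so it can't be filtered away:
var ajaxLoaded = false;

// At the very end of the external AJAX file (e.g. ajax.js):
ajaxLoaded = true;

// When the page has loaded, only enhance if the file made it through:
window.onload = function () {
    if (ajaxLoaded) {
        // Safe to hijack links/forms and use XMLHttpRequest here
    }
    // Otherwise the regular, reloading links are left as the fallback
};
</code>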
Ah, the wonderful hell that is clueless or violent proxy filtering.
Fortunately I've been able to build my latest project with Progressive Enhancement… but doing this is very difficult. I'm finding it hard to see how an application could be implemented with unobtrusive AJAX and still be infinitely scalable. Maybe I'm just not experienced enough to see it.
Interesting!
Therefore, to ensure that your site/application works, you must test all JS-capable browsers, older JS browsers (to verify graceful degradation), current but JS-disabled browsers, and current JS-enabled browsers with empty JS files.
Or, don't bother with JS because it is just too complicated. 🙂
Faruk,
Yes, isn’t it great? 🙂
Montoya,
I guess it depends on the application. Not every application can offer a fallback that perfectly matches the JavaScript version, but as far as possible, it’s a good goal to have.
Jules,
No, no, no. 🙂
JavaScript is amazing when used right (and in the absence of terrible proxy filters… :-)).
Oh that's cool, I just set up my proxy to filter all files including the text <code>javascript</code> 😉
Instead of using ActiveX to instantiate the XMLHttpRequest object for Microsoft browsers, how about trying the proprietary <code>xml</code> element?
Having looked at this before, we've used a very cheap load notifier, placed at the end of each externally loaded JS file:
<code>
// Placed at the very end of each externally loaded JavaScript file
var filename = "myfile.js";
if (typeof ld_scripts != 'object') window.ld_scripts = {};
ld_scripts[filename] = true;
</code>
Once the included files have loaded, you can check <code>ld_scripts</code> and make sure every file you need has made it intact.
In our case, we were using it to ensure that all scripts loaded via <code>document.write()</code> had actually completed before we tried to reference objects and classes in those external files.
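For completeness, the checking side might look something like this (<code>myfile.js</code> being just an example name):

<code>
// Before referencing anything defined in myfile.js:
if (typeof ld_scripts == 'object' && ld_scripts["myfile.js"]) {
    // The file loaded and ran; safe to use what it defines
}
else {
    // The file was emptied or never loaded; fall back gracefully
}
</code>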
Jeena,
Ha ha. 🙂
Tanny,
I haven't tried it out, but I guess that might work. Not sure, though…
Jakob,
Yes, that's the kind of approach that's needed.
What about <code>eval("new Active" + "X" + "Object(…)")</code>?
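Something along these lines, I mean (the <code>Microsoft.XMLHTTP</code> argument is only an example for the part I left out), so the filter never sees the string in one piece:

<code>
// A sketch: the string "new ActiveXObject" never appears as one piece,
// so a simple text filter shouldn't match it.
var request = eval('new Active' + 'X' + 'Object("Microsoft.XMLHTTP")');
</code>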
This is rather old news, yet topical: XML islands.
Dean Edwards recently wrote something about this:
XML Islands
But with XML islands, it's not possible to use the POST method…
/hbi
Markus,
Not sure, actually…
And unfortunately I don't have any good environment to test. I like the creative thinking, though! 🙂
It just might work!
Hakan,
Well, that might be an alternative fallback option, but, like you say, we then miss out on some functionality.
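For reference, the IE-only data island approach would look roughly like this (the element id and file name are just examples); the XML is fetched with a plain GET when the page loads, which is why POST is off the table:

<code>
<!-- IE-only XML data island; the src is fetched with a GET request -->
<xml id="newsData" src="news.xml"></xml>

<script type="text/javascript">
    window.onload = function () {
        // IE exposes the island's content as an XML document
        var island = document.getElementById("newsData");
        if (island && island.XMLDocument) {
            var items = island.XMLDocument.getElementsByTagName("item");
            alert("Loaded " + items.length + " items");
        }
    };
</script>
</code>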
[…] ne of a ticker script automatically, while the user is trying to read other content? Nope. […]
[…] tells you if JavaScript is enabled or not in the web browser; not if it actually works. There are proxy servers out there cleaning out JavaScript files with what it thinks is inappropriate content, over-zealous antivirus programs, firewalls preventing […]
[…] mobile devices often have problems with JavaScript or with layers. Some proxies in large companies filter JavaScript for security reasons, so you can’t rely on the ubiquitous availability, your script has to be […]
[…] Company proxy servers filtering out code (for example, read An important lesson learned about AJAX and accessibility). […]
[…] corporate proxy servers that filter code (read An important lesson learned about AJAX and accessibility on this […]
[…] An Important Lesson Learned About AJAX and Accessibility – Robert Nyman […]