
A journal about my experiences developing little applications in C#, Perl, HTML and JavaScript, and about the new things that I use. Always Geeky; Always Nerdy; Always poor Grammer!

I am a Software Analyst Developer working in Southport, England but living in Liverpool. I develop mainly in C# and ASP.Net, and I have been developing commercial software for several years now. I maintain this site (hosted at SwitchMedia UK) as a way of exploring new technologies (such as AJAX) and just generally talking about techie geek issues. The site is built from a host of Perl scripts and a liberal use of JavaScript. I enjoy experimenting with new technologies, and anything that I make I host here.


Sunday, September 04, 2005

The Failures of my first AJAX Application: Part 2

This is the second instalment of the "Failures of my First AJAX application" and is subtitled "It didn’t help reduce bandwidth".

One of the initial goals of the application was to access all the web services directly from the page without having to pass any queries through my own server. This would mean the only data that my server sent to the client would be the HTML and JavaScript for the page. All other requests would be handled by the client and would be directed straight to the third party web service. This would greatly reduce my bandwidth demands.

The diagram shows that my server should only return the response to the initial page request; the client handles all the other requests itself, using AJAX techniques (JavaScript and XML) to talk directly to the required web services.

It turned out differently, however. The actual implementation suffered from a lack of foresight. As soon as I created what I thought was my ideal solution [see above], I ran into problems running the AJAX code in Firefox. I also realised that, in a high-security configuration, Internet Explorer 6 and IE 7 disable data binding across data sources on other domains.
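The restriction both browsers enforce is the same-origin policy: a page may only make background requests to its own protocol and host. A minimal sketch of that check (hypothetical code, using the modern `URL` parser purely for illustration):

```javascript
// Sketch of the same-origin rule that blocked the direct calls.
// A request is only allowed when the protocol and host of the target
// match the page's own origin.
function sameOrigin(pageUrl, requestUrl) {
  var page = new URL(pageUrl);
  var req = new URL(requestUrl);
  return page.protocol === req.protocol && page.host === req.host;
}

// A direct call from my page to a third-party web service fails the check...
console.log(sameOrigin('http://www.kinlan.co.uk/index.html',
                       'http://api.technorati.com/search')); // false
// ...while a call back to a script on my own domain passes it.
console.log(sameOrigin('http://www.kinlan.co.uk/index.html',
                       'http://www.kinlan.co.uk/cgi-bin/proxy.pl')); // true
```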

To get around this I had to create proxy scripts [here and here] on my server that the client page calls instead (because the proxy is on my own domain, IE and Firefox allow the request). All the proxy scripts do is pass each request the client makes on to the correct web service. [see image below]

There is an added benefit to using a proxy script: you can hide any secret information that the client should not see, such as the developer token that Technorati requires.
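The idea can be sketched roughly as follows. The script name, the `service` parameter and the service table are all illustrative, not the real names, and the server side is shown in JavaScript only for readability (the actual proxies are Perl):

```javascript
// Client side: instead of calling the web service directly, the page
// builds a request to a (hypothetical) proxy script on my own domain.
function proxyUrl(service, query) {
  return '/cgi-bin/proxy.pl?service=' + encodeURIComponent(service) +
         '&q=' + encodeURIComponent(query);
}

// Server side (illustrative table): the proxy maps the request onto the
// real web service and adds the developer token, which the client never
// sees -- only the proxy URL is visible in the page source.
var SERVICES = {
  technorati: 'http://api.technorati.com/search?key=SECRET_TOKEN&query='
};
function forward(service, query) {
  return SERVICES[service] + encodeURIComponent(query);
}

console.log(proxyUrl('technorati', 'ajax'));
// -> /cgi-bin/proxy.pl?service=technorati&q=ajax
```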

The major downside, and the point of this entry, is that I now have to handle every client request to the desired web service rather than letting the client manage the request itself, which increases my bandwidth demands.

Advantages of the proxy approach:

  • Hides security information (such as developer tokens) that some web services require.
  • Allows me to monitor the requests that a client makes.
  • Proxy scripts can merge requests and perform any kind of data manipulation on the response before it reaches the client.

Disadvantages:

  • All data is passed through the server, using extra bandwidth.
  • Proxy scripts have to be created and maintained.
  • Proxy scripts may be insecure and may also consume too much server bandwidth.

In the next version I fully intend to support systems that allow cross-domain data sources, because doing so will greatly ease my bandwidth situation. But there may still be situations where I need to perform multiple calls to web services within one single call to my proxy script.
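Merging several web-service calls into one proxy request could look something like the sketch below, so the page makes a single round trip to my server. The batch format and parameter name are assumptions for illustration:

```javascript
// Hypothetical sketch: combine several { service, query } requests into
// one call to the proxy script, which would fan them out server-side.
function mergedProxyUrl(requests) {
  var parts = [];
  for (var i = 0; i < requests.length; i++) {
    parts.push(encodeURIComponent(requests[i].service) + ':' +
               encodeURIComponent(requests[i].query));
  }
  return '/cgi-bin/proxy.pl?batch=' + parts.join(',');
}

console.log(mergedProxyUrl([
  { service: 'technorati', query: 'ajax' },
  { service: 'feed', query: 'kinlan' }
]));
// -> /cgi-bin/proxy.pl?batch=technorati:ajax,feed:kinlan
```

One round trip per page update keeps the extra bandwidth cost of the proxy as low as possible while the cross-domain restrictions remain in place.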
