Wednesday, 9 December 2015

Why "browser" clients are not always a good idea

- or why you could/should consider an n-tier setup, spiced up with a briefcase model - part nil.

I have never really been able to see the revelation in browser clients - and I am not quite sure why anyone would always consider them a good option.

It might look tempting from an IT operations standpoint, and maybe also from a management view - but how would they know? It is the users' experience and productivity that count - and that is what we all should care about.

So I would say that if the job the client needs to handle is comparable to an old VT100 dumb terminal - then a browser it is.

I will do this as a multi-part series of blog posts, since there are numerous things I want to cover in an n-tier scenario - trying to utilize the various components to their best, and also trying to make the operational side of things just as easy as if it had been a browser-based solution.

I will also use this series to freshen up my DataSnap skills - that technology has improved a great deal over the last couple of versions, and a lot of the older info out on the internet doesn't do justice to some of the new features that are in there.

The posts will be a mixture of conceptual ideas, and working examples. I plan to cover the following topics:

- Rich clients
- DataSnap REST server(s) and JSONReflect.
- Utilizing backend DB servers - why you should not just treat them like a bucket of data.
- Don't let "flexibility" be an excuse for not optimizing your solution.
- Buying more and bigger hardware should never be the first solution to a problem.
- Briefcase model - cached data, why and when.
- Networking, failovers and balancing.

Even though these topics and concepts could be realized with a variety of tools and technologies, I will be using Delphi 10 Seattle, FireDAC and DataSnap as the primary tools. They are the tools I am most comfortable with, and I also find them very productive - compared to what else I have used myself, what I have seen others use, and what I see being delivered out there in the world. And at the end of the day - less is more.

So we will build an "intelligent" client that is able to detect when it goes offline and continue to work on its cached data, as in a briefcase model. When it reconnects, it will push the deltas of its updates back to the networked application server(s) and refresh its cache.
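To make the idea concrete, here is a minimal sketch of that flow using FireDAC's offline mode and cached updates. The module, query and handler names (TBriefcaseModule, CustomerQuery, RemoteConnection) are illustrative assumptions, not the actual design of the upcoming parts:

```pascal
// Assumes a TFDQuery (CustomerQuery) with CachedUpdates := True,
// attached to a TFDConnection (RemoteConnection).
procedure TBriefcaseModule.HandleConnectionLost;
begin
  // FireDAC's offline mode keeps the cached data usable without the server
  RemoteConnection.Offline;
end;

procedure TBriefcaseModule.HandleConnectionRestored;
begin
  RemoteConnection.Online;
  // Push only the deltas (inserts/updates/deletes) recorded while offline
  if CustomerQuery.UpdatesPending then
  begin
    CustomerQuery.ApplyUpdates(0); // 0 = stop on first error
    CustomerQuery.CommitUpdates;   // clear the applied change log
  end;
  CustomerQuery.Refresh;           // pull fresh data into the cache
end;
```

In a real client the error handling around ApplyUpdates would of course need attention - conflicting updates from other users is one of the topics a briefcase model has to deal with.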

I will walk through which data could be cached and which shouldn't be - which might also result in reduced functionality when offline. That can make perfect sense: if, for instance, you need to merge PDF data from a central repository into your printout before emailing the result to the customer, you would be able to preview and print your work without a network connection, but sending the mail would require one - and then you would have access to the online repositories anyway.
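For the data that does qualify for caching, the briefcase also has to survive a client restart. A hedged sketch of how that could look with FireDAC's dataset persistence - again with illustrative names and file paths:

```pascal
// Assumes the same CustomerQuery with CachedUpdates := True.
// SaveToFile persists both the data and the pending change log,
// so offline deltas survive a restart.
procedure TBriefcaseModule.SaveBriefcase;
begin
  CustomerQuery.SaveToFile('customers.fds', sfBinary);
end;

procedure TBriefcaseModule.LoadBriefcase;
begin
  CustomerQuery.LoadFromFile('customers.fds', sfBinary);
end;
```

A TFDMemTable would work the same way, and sfXML or sfJSON could be used instead of sfBinary if you want the briefcase file to be inspectable.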

I will show the concept of how a client can be pushed and kept updated, along with its "static" data - without using things like dreadful one-click solutions or other software management solutions whose pain would otherwise promote a browser-based solution.

Do not be blind to the complexity and incompatibility of the browser/web server world - Java should have stayed an academic exercise - and if you want to ensure that anything you do runs in every known browser, you need to keep the functionality to a minimum. That holds regardless of which language you choose - the browsers are your constraint.

A native compiler like Delphi makes the OS your constraint - and even that can be extended. But I would love to hear any good arguments for why a browser is a good client for normal to complex client tasks.

Well, enough teasers and rants. I will start on the first part - and some code - during the weekend. Promise :-)

It has been a very busy year - but there is also a lot on my mind that I want to blog about.