For those of you who did not follow the discussion on the technical aspects of a web interface for GNUmed, here is a short summary. I have looked at many web interfaces and learned quite a bit about existing web frameworks. It turned out that none of them fit our needs. This is because the web is designed around fire-and-forget requests while GNUmed wants a persistent connection to the database. The second aspect was that we wanted database access control handled by PostgreSQL itself rather than duplicated inside the application (which seems to be the industry standard).
Long story short: Luke Leighton from pyjamas came to help us and invested a substantial amount of time to get things rolling. A first working version was recently referenced in this article. All of the code that makes it work has recently been merged into the main GNUmed code repository.
It makes use of pyjamas, cjson and multitaskhttpd.
Here is how you can try it out and start improving it. This guide assumes you have GNUmed running already.
1.) Get and install lovely-jsonrpc
* wget http://lkcl.net/lovely-jsonrpc.tgz
* unpack it
* cd lovely-jsonrpc
* python setup.py install
2.) get and install cjson
* e.g. python-cjson on Debian
3.) get multitaskhttpd
* git clone git://pyjs.org/git/multitaskhttpd.git
* cd multitaskhttpd
* python proxyapp.py &
4.) get pyjamas
* git clone git://pyjamas.git.sourceforge.net/gitroot/pyjamas/pyjamas
* cd pyjamas
* python bootstrap.py
* cd bin
* put pyjsbuild into the PATH or symlink it into the ProxiedWeb directory of GNUmed
5.) get GNUmed from git master as a tgz or a git clone. Go to gitorious for instructions.
6.) compile the pyjamas application in the GNUmed source tree
* cd ProxiedWeb
* run build.sh (make sure pyjsbuild can be found on your system)
7.) start GNUmed like this: ./gm-from-vcs.sh --ui=web
8.) open a web browser and go to http://localhost:8080/ProxiedWeb/jsonrpc/output/JSONRPCExample.html
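Once all this is running, the pyjamas example in the browser talks to GNUmed via JSON-RPC requests which multitaskhttpd proxies to the backend. As a minimal illustration of the wire format only (the "echo" method and its parameter are made up for this sketch, not part of GNUmed's actual API), here is a JSON-RPC round trip using Python's standard json module (cjson encodes and decodes the same structures):

```python
import json

# Build a JSON-RPC request as the browser-side pyjamas code would.
# "echo" and its parameter are placeholder names for this sketch.
request = json.dumps({
    "id": 1,
    "method": "echo",
    "params": ["hello GNUmed"],
})

# A server would decode the request, dispatch it, and encode a reply.
decoded = json.loads(request)
response = json.dumps({
    "id": decoded["id"],
    "result": decoded["params"][0],
    "error": None,
})

print(json.loads(response)["result"])  # hello GNUmed
```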
If you want to hack on it, have a look at JSONRPCExample.py. It is a pyjamas application.
Enjoy.
2 comments:
a few years forward (now) and I wonder if this really nice doc is still valid. Do we still need multitaskhttpd? If so, where can I get it from?
regards.
hi paul, yes, absolutely you still need multitaskhttp.py and it should be included with the source that i worked on at the time.
the reason why is this: as you are probably aware gnumed is designed around postgresql, and gnumed is one of the rare SQL-based applications that makes fine-grained usage of postgresql access control. tables, read and write access are locked to specific user accounts, so that patient data is protected properly.
this is *COMPLETELY* and *UTTERLY* different from a standard "web application", where you have ONE application (the server), which uses ONE username/password to connect to the database, and then that (one) application - the server - implements access control in its own way - entirely manually, in the server's source code.
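the contrast can be sketched like this - open_connection() below is just a stand-in for a real driver call such as psycopg2.connect() (no live database assumed); the point is who holds the credentials:

```python
# stand-in for a real database driver call, e.g. psycopg2.connect();
# it just records which role the connection was opened as
def open_connection(user, password):
    return {"role": user}

# standard web application: ONE shared credential for every visitor,
# so access control has to be re-implemented in server code
def webapp_connection():
    return open_connection("webapp", "shared-secret")

# gnumed's model: each user connects with their OWN postgresql account,
# so the database itself enforces table-level read/write privileges
def gnumed_connection(username, password):
    return open_connection(username, password)

print(webapp_connection()["role"])                      # webapp
print(gnumed_connection("dr_smith", "secret")["role"])  # dr_smith
```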
also, one other major difference is that whilst the gnumed tk front-end maintains a persistent TCP/IP connection (per user), a web browser GOES AWAY after it has received its HTML response. the only identifying information available is to use session cookies (or other technique).
so the problem is that in a multi-process web server (which is perfectly normal), you have a user that comes along, logs in, and then the server *drops the sql connection*. when the web browser makes a page-refresh or the user causes it to load another page, the connection goes this time to a *completely different web server process*... how do you even _begin_ to share persistent SQL connections - let alone ones with different user access controls - across this kind of transient setup? answer: you don't.
so what i needed to write was an HTTP proxy which would maintain persistent connections (hence it had to be single-process) but could also handle multiple incoming as well as multiple outgoing requests.... again from a single process.
the only thing that i could find which could handle this type of extremely unusual requirement was: multitaskhttp.py.
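the core trick can be sketched with plain generators - this is a toy, not multitaskhttpd's actual code, but it shows how many in-flight requests can interleave inside one process (and therefore share one table of sql connections):

```python
# each request handler is a generator that yields whenever it would block
def handle_request(name, steps, log):
    for i in range(steps):
        log.append("%s step %d" % (name, i))
        yield  # give the other in-flight requests a turn

# a single-process round-robin scheduler: no threads, no forking, so all
# handlers can safely share one dictionary of persistent sql connections
def run(tasks):
    tasks = list(tasks)
    while tasks:
        task = tasks.pop(0)
        try:
            next(task)
            tasks.append(task)  # not finished yet: requeue it
        except StopIteration:
            pass  # this request is done

log = []
run([handle_request("req-A", 2, log), handle_request("req-B", 2, log)])
print(log)
# ['req-A step 0', 'req-B step 0', 'req-A step 1', 'req-B step 1']
```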
i then modified the standard python http client library to be re-entrant so that it would work with multitaskhttp (to act as the proxy).
also i added in, transparently, a session cookie on the incoming connection, so that it can be used to look up which persistent database connection to use for that session - the connections being held in a python dictionary, which is a reminder that this can ONLY be done in a single process.
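in a single process that lookup really is just a dictionary; a minimal sketch (with a stub standing in for the real per-user postgresql connection):

```python
# maps session cookie -> persistent database connection; this only
# works because everything lives in ONE process
sessions = {}

class StubConnection:
    """stands in for a real per-user postgresql connection"""
    def __init__(self, user):
        self.user = user

def connection_for(cookie, user):
    # reuse the persistent connection if this session already has one
    if cookie not in sessions:
        sessions[cookie] = StubConnection(user)
    return sessions[cookie]

a = connection_for("cookie-1", "dr_smith")
b = connection_for("cookie-1", "dr_smith")
c = connection_for("cookie-2", "nurse_jones")
print(a is b)  # True  - same session reuses its connection
print(a is c)  # False - different session, different connection
```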
in all it is a fantastically complicated design that quite literally CANNOT be made any simpler - period. this was why all other proposals from all other people who responded were rejected, because they did not understand or accept the complexity of what was needed.
the only other possibility that could be considered would be to totally redesign gnumed, removing the use of postgresql access control entirely and replacing it with hand-crafted access control mechanisms just as is done in any standard web server framework.
given that the design of gnumed's postgresql access control has been carefully audited over many many years to ensure that patient confidential data is not given out to people who are not authorised to see it, suggestions from web developers to remove the access control did not go down well.
so, yes it's still relevant, however there was still a heck of a lot that needed to be done (still needs to be done) - it was an experiment, it worked, but it was not followed up on.
Post a Comment