- E.T. Grey
February 4, 2006, 2:03 am
I have an interesting problem. I have a (LARGE) set of historical data that I want to keep on a central server, as several separate files. I want a client process to be able to request the data in a specific file by specifying the file name, a start date/time and an end date/time.
The files are in binary format, to conserve space on the server (and to reduce processing time). Each file can be quite large, covering several years of data. New data will be appended to these files each day by a (PHP) script. The server is likely to be a Unix machine, whereas my clients will be running on Windows machines. My client program is written in C++.
My main problems/questions are as follows:
1). Transfer method issue:
What is the best (i.e. most efficient and fastest) way to transfer data from the server to clients? I suspect SOAP will be too slow, because of the sheer size of the data.
2). Cross-platform issue:
How can I ensure that the binary data sent from the Unix server is correctly interpreted on the client side?
3). Security issue:
How can I prevent clients from directly accessing the files (to prevent malicious or accidental corruption of the data files)?
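On the transfer question, one option (a sketch, not a recommendation from the post) is a plain TCP connection with a small fixed-layout request message: the client sends the file name plus the start/end timestamps, and the server streams back only the matching rows. The field layout, function name, and file name below are my own assumptions for illustration; all integers are written big-endian so the bytes mean the same thing on both ends.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical request encoder: "send me the rows of <file> between
// <start_epoch> and <end_epoch>". Layout (all integers big-endian):
//   u32 filename length | filename bytes | u64 start | u64 end
std::vector<uint8_t> encode_request(const std::string& file,
                                    uint64_t start_epoch,
                                    uint64_t end_epoch) {
    std::vector<uint8_t> out;
    auto put_u32 = [&out](uint32_t v) {
        for (int shift = 24; shift >= 0; shift -= 8)
            out.push_back(static_cast<uint8_t>(v >> shift));
    };
    auto put_u64 = [&out](uint64_t v) {
        for (int shift = 56; shift >= 0; shift -= 8)
            out.push_back(static_cast<uint8_t>(v >> shift));
    };
    put_u32(static_cast<uint32_t>(file.size()));     // filename length
    out.insert(out.end(), file.begin(), file.end()); // filename bytes
    put_u64(start_epoch);                            // start of range
    put_u64(end_epoch);                              // end of range
    return out;
}
```

The same idea works regardless of whether the bytes travel over a raw socket, HTTP, or anything else; the point is that a compact binary request avoids the XML overhead that makes SOAP feel heavy for bulk data.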
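On the cross-platform question, the usual culprit is byte order (endianness): a multi-byte integer written with the server CPU's native layout may be read back scrambled on a client with the opposite layout. A common fix is to pick one order (conventionally big-endian, "network byte order") and shuffle bytes explicitly on both sides. A minimal sketch, with helper names of my own choosing:

```cpp
#include <cstdint>

// Encode a 32-bit unsigned integer in big-endian byte order, so the same
// four bytes decode to the same value on the Unix server and the Windows
// clients, whatever their native endianness.
inline void put_u32_be(uint8_t* buf, uint32_t v) {
    buf[0] = static_cast<uint8_t>(v >> 24);
    buf[1] = static_cast<uint8_t>(v >> 16);
    buf[2] = static_cast<uint8_t>(v >> 8);
    buf[3] = static_cast<uint8_t>(v);
}

// Decode the same layout back into a host integer.
inline uint32_t get_u32_be(const uint8_t* buf) {
    return (static_cast<uint32_t>(buf[0]) << 24) |
           (static_cast<uint32_t>(buf[1]) << 16) |
           (static_cast<uint32_t>(buf[2]) << 8)  |
            static_cast<uint32_t>(buf[3]);
}
```

The standard `htonl`/`ntohl` functions (POSIX and Winsock) do the same job for 32-bit values; explicit shifts like the above also cover 64-bit fields and sidestep struct padding and alignment differences between compilers.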