Decrease data usage with Blue hardware
#12
(2016-01-16, 09:16)kevinmcc Wrote: I was thinking QuickLZ, LZO, LZ4, LZ4-HC, or zlib for the compression. My first concern would be how much data needs to be compressed and how much CPU power the Blitzortung receivers have. Would they be able to compress the data fast enough? The second concern is the power needed to decompress at the server end.  I agree some data is not very compressible, while some is very easily compressible. I am still waiting for System Blue to launch so I can participate. Having not seen the data myself, I can not judge. Compression may not even be a worthwhile option.

I see a significant issue with those choices. Those compression algorithms only become efficient once you let data accumulate in a buffer for a while. It would be quite efficient to, say, build up an hour of data and then send it all in bulk, but that would delay detection significantly, and I presume the goal is to minimize latency. You really want stream compression algorithms for this sort of job. There are a few near-ideal candidates, but they are risky in practice for data you *must have*, because they carry no redundancy: lose or corrupt one byte and the rest of the stream is unrecoverable. If you're willing to sacrifice a few detections, you could probably get away with it though.
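To make the latency point concrete, here is a minimal sketch of per-packet stream compression using zlib (one of the codecs mentioned above). `Z_SYNC_FLUSH` forces the compressed bytes out immediately, so nothing waits in a buffer, while the shared dictionary state still lets later packets benefit from redundancy in earlier ones. The packet format shown is invented for illustration, not the actual Blitzortung wire format.

```python
import zlib

# Sender side: one compressor kept alive for the whole connection.
compressor = zlib.compressobj(level=6)
# Server side: matching stateful decompressor.
decompressor = zlib.decompressobj()

# Hypothetical signal reports; the real payload format is different.
packets = [b"station=1234;ts=1452934567;amp=512;" * 4 for _ in range(10)]

sent = 0
for pkt in packets:
    # Z_SYNC_FLUSH emits everything compressed so far, so each report
    # can be transmitted the moment it is produced (no hour-long buffer).
    chunk = compressor.compress(pkt) + compressor.flush(zlib.Z_SYNC_FLUSH)
    sent += len(chunk)
    # The server decompresses each chunk as it arrives.
    assert decompressor.decompress(chunk) == pkt

raw = sum(len(p) for p in packets)
print(f"raw={raw} bytes, compressed={sent} bytes")
```

Note the trade-off the post describes: if one chunk is lost, every later chunk fails to decompress, because the dictionary state no longer matches. A per-packet `compress()` without shared state avoids that, but loses most of the ratio.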
 

(2016-01-16, 12:51)Cutty Wrote: The majority of the data transmitted is 'junk':  high gains, many useless skywaves and noise signals, local disturbers (which I am plagued with).
 The goal is to eliminate as much junk as possible.  Ideally, perhaps, the network would work with stations covering a radius of <1000 km, not attempting to detect signals in other hemispheres.  This requires station density and optimization.  Many stations send too much noise, including mine.  As more stations come online, those such as mine will be reducing antennas and gain settings.  Some stations are quite clean, in a good environment, and do very well with longer range. Many do not.
There will be a push for smaller antennas, less 'distance' capability, and better data quality, since long-range 'DX' operation is NOT the way the system is envisioned.  These are not 'stand-alone' systems; they must participate as a 'cell' in a network to be effective.
A local station should normally switch to interference mode and stop transmitting during 'nearby' storms, for example; the rest of the network picks up the data.
The TOA / TOGA system considers the whole pulse train, the frequencies, and the 'respective energy' contained in the impulse, not just the discharge pulse timing at trigger or triggering on the 1st or 2nd skywave signal. Therefore 'recreating' or 'interpolating' those zero-crossing iterations from a 'sampling' would likely distort the quality control and stroke information. The system wants as much of the complete stroke as possible, with real data, as clean as possible.

If the complete waveform is considered, it would indeed be tricky to decompose the signal into a series of orthogonal symbols unless you took the algorithm into account, though I doubt anyone is insane enough to actually attempt that. I suppose more could be gained by introducing filtering to remove local noise and false signals. Then again, a few gigabytes per day on a residential connection isn't really that much anymore in this day and age, with 100+ Mbps readily available in many regions... I think it's more of an issue on the receiving server end these days.
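As a rough illustration of station-side filtering, here is a toy pre-filter that keeps impulsive waveforms (a sharp peak that decays, as a stroke would look) and drops sustained local noise before upload. The `looks_impulsive` function, the 0.4 ratio, and the sample waveforms are all invented for this sketch; they are not Blitzortung firmware parameters.

```python
def looks_impulsive(samples, ratio=0.4):
    """Keep a waveform only if few of its samples sit near the peak
    amplitude; a steady hum or disturber has almost every sample near
    the peak.  The 0.4 threshold is an illustrative assumption."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return False  # flat line: nothing worth sending
    near_peak = sum(1 for s in samples if abs(s) > 0.5 * peak)
    return near_peak / len(samples) < ratio

stroke = [0, 0, 900, -600, 300, -120, 40, -10, 0, 0]   # impulsive discharge
hum    = [100, -100, 100, -100, 100, -100, 100, -100]  # steady local noise

print(looks_impulsive(stroke))  # True: only 2 of 10 samples are near peak
print(looks_impulsive(hum))     # False: every sample is near peak
```

A real filter would of course look at spectral content as well, but even a crude amplitude-shape test like this shows how a station could discard disturber traffic locally instead of shipping it to the server.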


Messages In This Thread
RE: Decrease data usage with Blue hardware - by Bart - 2016-01-17, 10:21

