From: Frank von Delft
Date: 30 November 2011 19:48
(Old thread, just cleaning up, sorry...)
I thought James' algorithm didn't do anything to the spots, just to the stuff in between.
So one obvious way to handle this is for the data processing programs to look between the integrated spots as well, to check whether they're missing anything; they could then flag it up if they are, prompting the experimenter to haul out the originals.
I mean, data processing programs are looking between spots already, aren't th... oh. Right. Sorry, silly me. Well, maybe they should. Considering that's the reason we're pushing terabytes around cyberspace, according to this thread. Personally I'd already be happy with an indication of how many spots I'm not integrating.
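[For illustration only, a minimal sketch of what such a between-spot check might look like, assuming a 2D image array, a list of predicted spot centres, and NumPy/SciPy; the function name, box size and threshold are hypothetical and not part of any existing processing program.]

    # Hypothetical sketch: after integration, scan the regions *between*
    # the predicted spots for significant intensity and report how much
    # is being left behind.  Illustrative only.
    import numpy as np
    from scipy import ndimage

    def unintegrated_spot_report(image, predicted_centroids, box=5, nsigma=5.0):
        """Count blobs of significant intensity outside all predicted spots.

        image               -- 2D array of detector counts
        predicted_centroids -- iterable of (row, col) predicted spot centres
        box                 -- half-width of the box masked around each prediction
        nsigma              -- significance threshold above the local background
        """
        mask = np.zeros(image.shape, dtype=bool)
        for r, c in predicted_centroids:
            r0, r1 = max(0, int(r) - box), min(image.shape[0], int(r) + box + 1)
            c0, c1 = max(0, int(c) - box), min(image.shape[1], int(c) + box + 1)
            mask[r0:r1, c0:c1] = True          # pixels claimed by the integration

        between = image[~mask]                  # "the stuff in between"
        bg, sigma = np.median(between), between.std()

        # Pixels outside every predicted spot that are still well above background
        significant = (~mask) & (image > bg + nsigma * sigma)
        labels, n_blobs = ndimage.label(significant)
        return n_blobs                          # rough count of spots not integrated

[A per-image count along these lines would already give something like the "how many spots am I not integrating" indication asked for above.]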
phx.
On 08/11/2011 12:17, Herbert J. Bernstein wrote:
Um, but isn't Crystallography based on a series of
one-way computational processes:
photons -> images
images -> {structure factors, symmetry}
{structure factors, symmetry, chemistry} -> solution
{structure factors, symmetry, chemistry, solution} -> refined solution
At each stage we tolerate a certain amount of noise
in "going backwards". Certainly it is desirable to
have the "original data" to be able to go forwards,
but until the arrival of pixel array detectors, we
were very far from having the true original data,
and even pixel array detectors don't capture every
single photon.
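[A purely illustrative aside, not from the original thread: the photon-counting point can be made numerically. Two exposures of the same "true" signal already differ by Poisson noise, so even an ideal detector does not hand us the underlying truth. A minimal sketch, assuming NumPy; the numbers are invented.]

    # Illustrative only: even an ideal photon-counting detector returns a
    # noisy sample of the underlying signal, so the recorded "original data"
    # are already one step removed from the truth described above.
    import numpy as np

    rng = np.random.default_rng(0)
    true_signal = np.full((100, 100), 50.0)     # mean photons per pixel
    image_a = rng.poisson(true_signal)          # two exposures of the same
    image_b = rng.poisson(true_signal)          # "true" signal

    # Per-pixel scatter between two nominally identical images
    rms_diff = np.sqrt(np.mean((image_a - image_b) ** 2))
    print(f"RMS difference between exposures: {rms_diff:.1f} counts "
          f"(~{100 * rms_diff / 50:.0f}% of the signal)")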
I am not recommending lossy compressed images as
a perfect replacement for lossless compressed images,
any more than I would recommend structure factors
as a replacement for images. It would be nice
if we all had large budgets, huge storage capacity
and high network speeds and if somebody would repeal
the speed of light and other physical constraints, so that
engineering compromises were never necessary, but as
James has noted, accepting such engineering compromises
has been of great value to our colleagues who work
with the massive image streams of the entertainment
industry. Without lossy compression, we would not
have the _higher_ image quality we now enjoy in the
less-than-perfectly-faithful HDTV world that has replaced
the highly faithful, but lower capacity, NTSC/PAL world.
Please, in this, let us not allow the perfect to be
the enemy of the good. James is proposing something
good.
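[A toy sketch of the compromise under discussion, in the spirit of the thread's description: the Bragg spots are left untouched and only the background between them is degraded. This is not James Holton's actual algorithm; the synthetic image, threshold and quantisation step are invented purely to show why such a trade-off pays off.]

    # Toy sketch: keep spot pixels exactly, coarsely quantise the background,
    # then compare losslessly-compressed sizes.  Illustrative only.
    import numpy as np, zlib

    rng = np.random.default_rng(1)
    image = rng.poisson(10.0, size=(512, 512)).astype(np.int32)   # noisy background
    for r, c in rng.integers(20, 492, size=(200, 2)):             # add bright "spots"
        image[r-2:r+3, c-2:c+3] += 500

    raw = image.tobytes()
    lossless = zlib.compress(raw, 9)

    # "Lossy": keep pixels near spots exactly, drop two bits of the background
    spot_mask = image > 100
    lossy_image = image.copy()
    lossy_image[~spot_mask] = (image[~spot_mask] // 4) * 4
    lossy = zlib.compress(lossy_image.tobytes(), 9)

    print(f"lossless: {len(lossless)/len(raw):.2%} of raw, "
          f"spots-preserved lossy: {len(lossy)/len(raw):.2%} of raw")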
Regards,
Herbert
=====================================================
Herbert J. Bernstein
Professor of Mathematics and Computer Science
On Tue, 8 Nov 2011, Harry Powell wrote:
Hi

I agree.

> I am not a fan of one-way computational processes with unique data.
> Thoughts anyone?
> Cheerio,
> Graeme

Harry
--
Dr Harry Powell, MRC Laboratory of Molecular Biology, MRC Centre, Hills Road, Cambridge, CB2 0QH
http://www.iucr.org/resources/commissions/crystallographic-computing/schools/mieres2011
----------
From: John R Helliwell
Dear Frank,
Re "pushing terabytes around cyberspace".
Well, actually, having the synchrotron facilities host locally the
datasets that were measured there is a major step forward for
diffraction data preservation, especially for MX but also for
SAXS, XAFS etc, as is being pushed forward by Alun Ashton and
colleagues at DLS. The European SR and neutron facilities' 'PaN' data
archiving initiative also relates firmly to this.
For the major hurdle of chemical crystallography datasets, the
majority of which are measured on local X-ray sources, it is looking
promising that local university data repositories will host these. At
least, the discussions with specialists here at the University of
Manchester are underway and look encouraging.
The compression approach also sounds promising, as James Holton has
indicated in detail with a wide range of tests.
Greetings,
John
--
Professor John R Helliwell DSc