Sunday, 29 April 2012

Refining Against Reflections?

From: Jacob Keller
Date: 19 March 2012 14:46


Dear Crystallographers,

It occurred to me that most datasets, certainly since the advent of synchrotrons, probably have some degree of radiation damage, if not a huge degree thereof. Therefore, I was thinking an exposure-dependent parameter might be introduced into the atomic models, an exposure-dependent occupancy of sorts. However, this would require refinement programs to use individual observations as data rather than merged reflections, effectively integrating scaling into refinement. Is there any talk of doing this? I think the hardware could reasonably handle it now.

And, besides the question of radiation damage, isn't it perhaps reasonable to integrate scaling into refinement anyway, now that hardware constraints are so much less limiting?

Jacob
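
A minimal sketch of the parameterisation being proposed, assuming each atom's occupancy decays exponentially with accumulated dose and that each unmerged observation is predicted at the dose of its image. All names and numbers below (occ0, tau, f0, the toy coordinates) are purely illustrative; the form factor is a crude constant and B-factors are ignored, so this only makes the idea concrete rather than suggesting an implementation.

import numpy as np

def structure_factor(hkl, atoms, dose):
    # Structure factor for one reflection at a given accumulated dose.
    # Each atom's occupancy decays as occ0 * exp(-dose / tau), so every
    # unmerged observation sees the model at its own dose.
    F = 0.0 + 0.0j
    for atom in atoms:
        occ = atom["occ0"] * np.exp(-dose / atom["tau"])
        phase = 2.0 * np.pi * np.dot(hkl, atom["xyz"])  # fractional coordinates
        F += occ * atom["f0"] * np.exp(1j * phase)      # constant form factor, no B
    return F

# Toy model: one light atom and one fast-decaying heavy atom.
atoms = [
    {"xyz": np.array([0.10, 0.20, 0.30]), "f0": 6.0,  "occ0": 1.0, "tau": 50.0},
    {"xyz": np.array([0.40, 0.15, 0.70]), "f0": 80.0, "occ0": 1.0, "tau": 5.0},
]

hkl = np.array([2, 1, 3])
for dose in (0.0, 2.0, 10.0):
    print(dose, abs(structure_factor(hkl, atoms, dose)) ** 2)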



----------
From: Bernhard Rupp (Hofkristallrat a.D.)


As you observe, radiation damage is local, but the effect is - to a different extent - on all Fs, i.e. global (all atoms and their damage contribute to each hkl).

So one would need additional local parameters (reducing N/P) if you want to address it as such; your use of occupancy is an example (even if you have a reflection-specific decay, somehow a realistic underlying atomic model would be desirable, and just changing occ might not be ideal)... So is the question then 'Could a reflection-specific, time-dependent decay factor translate into any useful atom-specific model parameter?'

 

BR
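
A small numerical illustration of the 'local damage, global effect' point, using the same kind of toy model as the sketch above (all numbers invented): because every atom contributes to every hkl with its own decay rate, the fractional intensity loss at a given dose differs from reflection to reflection, so it cannot be absorbed by a single per-image scale factor.

import numpy as np

# Two atoms with different decay constants (tau): damage is local to atoms.
xyz = np.array([[0.10, 0.20, 0.30], [0.40, 0.15, 0.70]])
f0  = np.array([6.0, 80.0])
tau = np.array([50.0, 5.0])

def intensity(hkl, dose):
    occ = np.exp(-dose / tau)                      # per-atom decay
    phases = 2.0 * np.pi * xyz @ np.asarray(hkl)   # fractional coordinates
    F = np.sum(occ * f0 * np.exp(1j * phases))
    return abs(F) ** 2

# The same accumulated dose affects different reflections differently:
for hkl in ([1, 0, 0], [2, 1, 3], [5, 5, 5]):
    print(hkl, intensity(hkl, 10.0) / intensity(hkl, 0.0))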

 



----------
From: Jacob Keller


I was thinking actually that the dose-dependent occupancy would really be a tau in an exponential decay function for each atom, and the taus could be fitted by how well they account for the changes in intensities (these should actually not always be decreases, which is the problem with correcting radiation damage at the scaling stage without iterating with models/refinement). I guess accurate typical values would be needed to start with, similar to the routinely-used geometry parameters. Actually, perhaps it would be better just to assume book values initially, then fit the dose rate, since this is probably not known so accurately, and then refine the individual taus, especially for heavy atoms.

This of course would also have great implications for the ability to phase using radiation damage to heavy atoms (RIP)--there would have to be something like a Patterson map mixed somehow with the exponentials, which would show sites with the shortest half-lives.

JPK
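
A sketch of the fitting step being described, assuming the observations are kept unmerged and tagged with an accumulated dose. The data here are simulated from the same toy model, the starting 'book value' of 20 is invented, and scipy's generic least_squares routine stands in for whatever machinery a real refinement program would use.

import numpy as np
from scipy.optimize import least_squares

# Toy "observations": per-image intensities of a few reflections, generated
# from a model with per-atom exponential decay (tau).  In a real program the
# observations would come from the unmerged integration output.
xyz = np.array([[0.10, 0.20, 0.30], [0.40, 0.15, 0.70]])
f0  = np.array([6.0, 80.0])
true_tau = np.array([50.0, 5.0])
hkls  = np.array([[1, 0, 0], [2, 1, 3], [5, 5, 5], [3, 0, 1]])
doses = np.linspace(0.0, 10.0, 20)

def model_intensities(tau):
    occ = np.exp(-doses[:, None] / tau)             # (n_images, n_atoms)
    phases = 2.0 * np.pi * hkls @ xyz.T             # (n_refl, n_atoms)
    F = occ[:, None, :] * f0 * np.exp(1j * phases)[None, :, :]
    return np.abs(F.sum(axis=-1)) ** 2              # (n_images, n_refl)

obs = model_intensities(true_tau)                   # pretend these were measured

def residuals(log_tau):
    return (model_intensities(np.exp(log_tau)) - obs).ravel()

# Start from generic "book values" and refine the individual taus
# (log parameterisation keeps them positive).
fit = least_squares(residuals, x0=np.log([20.0, 20.0]))
print("refined taus:", np.exp(fit.x))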

----------
From: Nicholas M Glykos


Hi Jacob,

> Therefore, I was thinking an exposure-dependent parameter might be
> introduced into the atomic models, as an exposure-dependent occupancy of
> sorts. However, this would require refinement programs to use individual
> observations as data rather than combined reflections, effectively
> integrating scaling into refinement.

It seems to me that this approach would only be valid if the atomic (pdb)
model is a valid representation of the crystal structure irrespective of
how much radiation damage it has suffered. To put it differently, the
suggested approach would only be valid if the data remain strictly
isomorphous for the length of the experiment (within a scale and overall
B factor). But if the data are indeed isomorphous throughout the data
collection, then the current treatment (which - through scaling -
essentially extrapolates to zero radiation damage) would be equivalent to
your suggested procedure. If, on the other hand, radiation damage causes
non-isomorphism (but you still deposit one atomic model), you would be
absorbing unknown model errors into yet another set of adjustable
parameters.

My two cents,
Nicholas
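
One way to picture the current treatment Nicholas refers to is a per-reflection zero-dose extrapolation done at the scaling stage, in the spirit of published zero-dose extrapolation schemes. A toy version with invented numbers (real schemes fit smooth per-reflection decay curves across the whole unmerged dataset):

import numpy as np

# One reflection measured repeatedly at increasing accumulated dose.
doses  = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
intens = np.array([980.0, 890.0, 820.0, 745.0, 680.0])   # invented numbers

# Fit log(I) = log(I0) - k * dose and extrapolate back to dose = 0.
slope, logI0 = np.polyfit(doses, np.log(intens), 1)
print("zero-dose estimate:", np.exp(logI0), "decay constant:", -slope)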



