Sunday 19 February 2012

Problem with getting Rfree and Rf down

From: Sam Arnosti
Date: 23 January 2012 21:48


Hi everyone,

I have some crystals in the space group P3121, from which I collected 180 frames of data.

The crystals do not diffract beyond about 2.0 Å at best, but Rwork barely goes below 23%,
and Rfree remains somewhere between 28% and 33%. I have refined the structure as far as I can.
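For reference, the quantity in question is the standard crystallographic residual

    R_{\mathrm{work}} = \frac{\sum_{hkl} \bigl|\, |F_{\mathrm{obs}}| - k\,|F_{\mathrm{calc}}| \,\bigr|}{\sum_{hkl} |F_{\mathrm{obs}}|}

with Rfree defined identically, but summed over the small test set of reflections held out of refinement.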

I do not know whether the problem comes from the poor diffraction or from collecting extra frames.

The structure factors are also high, but they improve as the crystals diffract better.

Thanks

Sam

----------
From: Ed Pozharski


These R-values are reasonable:

http://xray.bmc.uu.se/gerard/supmat/rfree2000/plotter.html
--
After much deep and profound brain things inside my head,
I have decided to thank you for bringing peace to our home.
                                   Julian, King of Lemurs

----------
From: Yuri Pompeu


Hi Sam,
Some obvious questions:
1. Is the space group right? (I'd say so from your R values...)
2. Is your data good throughout all 180 frames? What is Rsym if you take only the first 100? (A rough sketch of that check follows this list.)
3. How good/complete is the model? Missing parts, residues, base pairs?
4. Evaluate your refinement strategy...
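A minimal sketch of the check in point 2, assuming unmerged, symmetry-reduced observations as (hkl, batch, intensity) tuples; in practice you would simply rerun your scaling program on the truncated batch range:

    from collections import defaultdict

    def rmerge(observations):
        """Unweighted Rmerge over unmerged observations.

        observations: iterable of (hkl, batch, intensity) tuples, with
        hkl already reduced to the asymmetric unit.
        """
        groups = defaultdict(list)
        for hkl, batch, intensity in observations:
            groups[hkl].append(intensity)
        num = den = 0.0
        for ivals in groups.values():
            mean = sum(ivals) / len(ivals)
            num += sum(abs(i - mean) for i in ivals)
            den += sum(ivals)
        return num / den

    # obs = load_unmerged(...)  # hypothetical loader
    # print(rmerge(obs))                                # all 180 frames
    # print(rmerge(o for o in obs if o[1] <= 100))      # first 100 only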
HTH

----------
From: David Schuller


Trigonal space groups have a choice of two arbitrary "settings."

Were all those 180 frames collected in one pass? What were the integration statistics?

What was the phasing method? Did you have a pre-existing structure*, molecular replacement, anomalous, or what?

* This is the one I am concerned about. If you had a pre-existing structure in this same crystal form, but then used some newly collected data, it may not be in the same setting as the original.
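One way to check for that (a sketch only; the file names are placeholders, and this assumes CCP4's pointless is on your path) is to give pointless the old data as a reference so it can test the alternative indexing:

    import subprocess

    # Let pointless compare the new data against data in the original
    # setting and reindex if the alternative trigonal setting fits better.
    subprocess.run(
        ["pointless",
         "HKLREF", "old_setting.mtz",   # data matching the original model
         "HKLIN",  "new_data.mtz",      # newly collected data
         "HKLOUT", "reindexed.mtz"],    # output in the reference setting
        check=True)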
--
=======================================================================
All Things Serve the Beam
=======================================================================
                              David J. Schuller
                              modern man in a post-modern world
                              MacCHESS, Cornell University
                              schuller@cornell.edu

----------
From: Greg Costakes


It would seem that you have large model bias. A rule of thumb is to keep Rwork and Rfree within 5% of each other. If you find that the numbers are separating during refinement, you need to reduce the weighting factor (don't use automatic weighting). What is your overall redundancy? Higher redundancies (>7 or so) do tend to increase overall R/Rfree. Don't worry so much about getting the R factor down as opposed to keeping it close to Rfree.

----------
From: Ian Tickle


Reducing the AUTO weighting factor will have exactly the same effect
as reducing the MATRIX factor: if you examine the Refmac code you'll
see they both end up modifying the reflection weights.  I find the
default value of 10 for the AUTO factor much too high once the
structure has been completely built and refinement is progressing.  If
a structure is already refined (e.g. protein-ligand work) I tend to
start at WEIGHT AUTO 4 or even 2.5 and work down towards WA=1.  This
is the theoretically correct value (which is why I prefer to have the
weight in units of WA instead of the completely arbitrary 'MATRIX'
units), but usually it ends up being a little higher than 1, presumably
because of some residual model errors.
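For concreteness, a sketch of such a run driven from Python (file names are placeholders; this assumes a CCP4 setup with refmac5 on the path):

    import subprocess

    # Refmac5 keywords: start with a modest X-ray weight and work down
    # towards WA = 1 as the model improves.
    keywords = "WEIGHT AUTO 2.5\nNCYC 10\nEND\n"

    subprocess.run(
        ["refmac5",
         "HKLIN", "data.mtz",  "HKLOUT", "refined.mtz",
         "XYZIN", "model.pdb", "XYZOUT", "refined.pdb"],
        input=keywords, text=True, check=True)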

Cheers

-- Ian

----------
From: Dale Tronrud


  Is this observation about redundancies a general rule that I missed?
It seems rather surprising to me. What results have others seen?

Dale Tronrud


----------
From: Greg Costakes 



Whoops, I misspoke... I meant that Rsym and Rmerge increase with higher redundancy.
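That is the expected statistical behaviour of the unweighted residual; the multiplicity-independent Rmeas of Diederichs & Karplus (1997) corrects for it:

    R_{\mathrm{merge}} = \frac{\sum_{hkl}\sum_{i}\bigl|I_i(hkl) - \bar{I}(hkl)\bigr|}{\sum_{hkl}\sum_{i} I_i(hkl)},
    \qquad
    R_{\mathrm{meas}} = \frac{\sum_{hkl}\sqrt{\frac{n_{hkl}}{n_{hkl}-1}}\;\sum_{i}\bigl|I_i(hkl) - \bar{I}(hkl)\bigr|}{\sum_{hkl}\sum_{i} I_i(hkl)}

where n_{hkl} is the multiplicity of each unique reflection.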


----------
From: Miguel Ortiz Lombardia 



On 24/01/12 18:56, Greg Costakes wrote:
But then suppose that one merges data from a crystal that is degrading
while exposed, so the data get degraded. This is not at all unusual. In
the absence of a deep understanding of refinement, intuition suggests
that degraded data should produce degraded models. If Rwork and Rfree
are measuring anything useful, they should go up with redundancy in those
not-so-unusual cases. Or intuition is misleading me again.



----------
From: Dale Tronrud


  Yes, if one has a poorer-quality data set one expects Rwork and Rfree to
be higher, but this does not necessarily correlate with high redundancy.
Surely if you have high redundancy and know the crystal is decaying, you
have the flexibility to leave the decayed data out of the merge. I would
expect decayed data to be merged with the early data only if the
redundancy was so low that you had no choice, just to get a complete data set.
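Something as simple as cutting the batch range before merging does it; a sketch with made-up arrays (not any particular program's file format or API):

    import numpy as np

    def trim_decayed(hkl, intensity, batch, batch_cutoff):
        # Keep only observations recorded before the crystal decayed;
        # batch_cutoff would come from inspecting per-batch scaling
        # statistics such as Rmerge or scale/B-factor versus batch.
        keep = batch <= batch_cutoff
        return hkl[keep], intensity[keep], batch[keep]

    # hkl, intensity, batch = load_unmerged("data.refl")  # hypothetical loader
    # hkl, intensity, batch = trim_decayed(hkl, intensity, batch, 120)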

Dale Tronrud

----------
From: Miguel Ortiz Lombardía


On 24/01/12 21:18, Dale Tronrud wrote:
I agree, and I would also expect so... unless the user simply runs the data
reduction software and does not check the log files to see, among other
important things, at what point the data start degrading due to a
decaying crystal. If the software is clever enough to decide by itself,
all will be well, or sort of, which is, I suppose, a point in favour of
automation. Unfortunately, there are many users of black boxes, which
is, I presume, a danger of automation. My answer was meant as a caveat
for that type of user.


--
Miguel

Architecture et Fonction des Macromolécules Biologiques (UMR7257)
CNRS, Aix-Marseille Université

----------
From: Eleanor Dodson


It is a bit of a mystery to me why two structures of supposedly similar resolution, with equally acceptable maps, can give very different R factors: one sticks in the low 20s while another gives a smug 17%...

I guess one could go back and analyse the model against the data over time...
Eleanor

----------
From: James Holton


Merging in radiation-damaged data can indeed raise R/Rfree, because the structure factors no longer correspond to the native structure.  Rather, they are an intensity-average of the native and damaged structures, and that can be hard to fit to a coordinate model!   How much damage is too much?  I'd say it's when the change in the data, the "error due to damage", becomes comparable with the lowest error you could hope to get when fitting a model to the native data: ~20-30% (R/Rfree).  This generally happens after about 20-30 MGy (Banumathi et al., 2006; Owen et al., 2006; Kmetko et al., 2006).

However, "redundancy" and "radiation damage" are not the same thing.  Contrary to popular belief, it IS possible to take many, many exposures from the same crystal without doing any more damage than the usual ~100 exposures would.  How?  What manner of trickery is this?  Simple!  You use a shorter exposure time.

Personally, I always think about "redundancy" or "multiplicity" in the context of a fixed crystal "lifedose" (http://dx.doi.org/10.1107/S0909049509004361).  That is, you only get so many seconds of shutter-open time before the crystal is dead.  So, to me, "strategy" is nothing more than deciding how to divide up those shutter-open seconds, and the only way to increase redundancy/multiplicity is to shorten the exposure time.  Which, by the way, is almost always a good idea.
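To put rough numbers on that (the dose rate here is an illustrative assumption, not any particular beamline's):

    # Fixed "lifedose" arithmetic: the shutter-open budget is constant,
    # so more frames just means shorter exposures, not more damage.
    LIFEDOSE_MGY = 25.0           # dose budget, ~20-30 MGy as cited above
    DOSE_RATE_MGY_PER_S = 0.05    # assumed dose rate with the shutter open

    budget_s = LIFEDOSE_MGY / DOSE_RATE_MGY_PER_S  # total shutter-open seconds

    for n_frames in (100, 400, 1600):
        print(f"{n_frames:5d} frames -> {budget_s / n_frames:7.3f} s/frame")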

-James Holton
MAD Scientist

