Silvana Westbury, Patrick Fuhrmann, Juliane Marauska, Brian Matthews, Alun Ashton, Abigail McBirnie, Carlo Minotti, Anton Barty, Majid Ounsy, Ana Valcarcel-Orti, Uwe Konrad, Kat Roarty, Paul Millar
Open Access
Apr 2023
Ed Daniel, Mirko M. Maksimainen, Neil Smith, Ville Ratas, Ekaterina Biterova, Sudarshan N. Murthy, M. Tanvir Rahman, Tiila-Riikka Kiema, Shruthi Sridhar, Gabriele Cordara, Subhadra Dalwani, Rajaram Venkatesan, Jaime Prilusky, Orly Dym, Lari Lehtiö, M. Kristian Koski, Alun W. Ashton, Joel L. Sussman, Rikkert K. Wierenga
Open Access
Abstract: The web-based IceBear software is a versatile tool to monitor the results of crystallization experiments and is designed to facilitate supervisor and student communications. It also records and tracks all relevant information from crystallization setup to PDB deposition in protein crystallography projects. Fully automated data collection is now possible at several synchrotrons, which means that the number of samples tested at the synchrotron is currently increasing rapidly. Therefore, the protein crystallography research communities at the University of Oulu, Weizmann Institute of Science and Diamond Light Source have joined forces to automate the uploading of sample metadata to the synchrotron. In IceBear, each crystal selected for data collection is given a unique sample name and a crystal page is generated. Subsequently, the metadata required for data collection are uploaded directly to the ISPyB synchrotron database by a shipment module, and for each sample a link to the relevant ISPyB page is stored. IceBear allows notes to be made for each sample during cryocooling treatment and during data collection, as well as in later steps of the structure determination. Protocols are also available to aid the recycling of pins, pucks and dewars when the dewar returns from the synchrotron. The IceBear database is organized around projects, and project members can easily access the crystallization and diffraction metadata for each sample, as well as any additional information that has been provided via the notes. The crystal page for each sample connects the crystallization, diffraction and structural information by providing links to the IceBear drop-viewer page and to the ISPyB data-collection page, as well as to the structure deposited in the Protein Data Bank.
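The per-crystal metadata upload described above can be pictured as one structured record per sample. The sketch below is purely illustrative: the field names and URL are invented for this example and are not the IceBear or ISPyB schema.

```python
# Hypothetical sample record, illustrating the kind of per-crystal metadata a
# shipment module might serialize for a synchrotron database. All field names
# and the URL below are invented for illustration, not the real schema.
import json

sample = {
    "sample_name": "MyProtein-x0001",                         # unique crystal name
    "protein_acronym": "MYP",
    "crystal_page": "https://icebear.example/crystal/x0001",  # placeholder URL
    "puck": "DLS-0123",
    "pin_position": 4,
    "notes": "cryocooled in 25% glycerol",
}

# Serialize for upload; the round trip preserves every field.
payload = json.dumps(sample, indent=2)
print(payload)
```

A record like this also gives each crystal a stable identity to which diffraction results and a PDB deposition can later be linked.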
Feb 2021

Data acquisition
Detectors
Open Access
Abstract: The Diamond Light Source data analysis infrastructure, Zocalo, is built on a messaging framework. Analysis tasks are processed by a scalable pool of workers running on cluster nodes. Results can be written to a common file system, sent to another worker for further downstream processing and/or streamed to a LIMS. Zocalo allows increased parallelization of computationally expensive tasks and makes the use of computational resources more efficient. The infrastructure is low-latency, fault-tolerant and allows for highly dynamic data processing. By moving away from static workflows expressed in shell scripts, we can easily re-trigger processing tasks in the event that an issue is found. Zocalo allows users to re-run tasks with additional input and ensures that automatically and manually triggered processing results are treated equally. It was originally conceived to cope with the additional demand on infrastructure from the introduction of Eiger detectors, with up to 18 Mpixels and framerates of up to 560 Hz, on single-crystal diffraction beamlines. We are now adapting Zocalo to manage processing tasks for ptychography, tomography, cryo-EM and serial crystallography workloads.
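The worker-pool pattern the abstract describes can be sketched with nothing more than the Python standard library. This is a generic message-queue consumer, not Zocalo's actual API; the task names and result shape are invented.

```python
# Minimal sketch of a message-driven worker pool: tasks arrive on a queue, a
# scalable set of workers consumes them, results land on a second queue.
# This illustrates the pattern only; Zocalo itself uses a real message broker.
import queue
import threading

def worker(tasks: queue.Queue, results: queue.Queue) -> None:
    while True:
        msg = tasks.get()
        if msg is None:          # poison pill: shut this worker down
            tasks.task_done()
            break
        # A real system would run an analysis step here (e.g. spot finding).
        results.put({"task": msg, "status": "done"})
        tasks.task_done()

tasks, results = queue.Queue(), queue.Queue()
pool = [threading.Thread(target=worker, args=(tasks, results)) for _ in range(4)]
for t in pool:
    t.start()

for i in range(8):               # enqueue eight invented "images"
    tasks.put(f"image-{i:03d}")
tasks.join()                     # wait until every task has been processed

for _ in pool:                   # one poison pill per worker
    tasks.put(None)
for t in pool:
    t.join()

print(results.qsize())           # 8 processed results
```

Because workers only share state through the queues, the pool size can be scaled up or down without changing the processing code, which is the property that makes the approach suit bursty detector workloads.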
Oct 2019

I03-Macromolecular Crystallography
I24-Microfocus Macromolecular Crystallography
Graeme Winter, Richard J. Gildea, Neil G. Paterson, John Beale, Markus Gerstel, Danny Axford, Melanie Vollmar, Katherine E. McAuley, Robin L. Owen, Ralf Flaig, Alun W. Ashton, David Hall
Open Access
Abstract: Strategies for collecting X-ray diffraction data have evolved alongside beamline hardware and detector developments. The traditional approaches for diffraction data collection have emphasised collecting data from noisy integrating detectors (i.e. film, image plates and CCD detectors). With fast pixel array detectors on stable beamlines, the limiting factor becomes the sample lifetime, and the question becomes one of how to expend the photons that your sample can diffract, i.e. as a smaller number of stronger measurements or a larger number of weaker data. This parameter space is explored via experiment and synthetic data treatment and advice is derived on how best to use the equipment on a modern beamline. Suggestions are also made on how to acquire data in a conservative manner if very little is known about the sample lifetime.
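The photon-budget question in this abstract reduces to a simple trade-off: a fixed tolerable exposure can be spent on a few strong frames or many weak ones. The toy illustration below uses invented numbers and is not taken from the paper.

```python
# Toy photon-budget split: the same total exposure spent two different ways.
# The 10 s "sample lifetime" and the sweep parameters are invented numbers.
def per_frame_exposure(total_exposure_s: float, n_frames: int) -> float:
    """Evenly divide a sample's tolerable exposure across n_frames images."""
    return total_exposure_s / n_frames

budget_s = 10.0                                 # invented tolerable exposure
strong = per_frame_exposure(budget_s, 180)      # 1.0 deg/frame over a 180 deg sweep
weak = per_frame_exposure(budget_s, 1800)       # 0.1 deg/frame, same sweep
print(strong, weak)                             # 10x more frames, 10x less per frame
```

Either strategy deposits the same total dose; what differs is how the signal is partitioned, which is exactly the parameter space the paper explores by experiment and with synthetic data.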
Mar 2019

Krios I-Titan Krios I at Diamond
J. Gómez-Blanco, J. M. de la Rosa-Trevín, R. Marabini, L. del Cano, A. Jiménez, M. Martínez, R. Melero, T. Majtner, D. Maluenda, J. Mota, Y. Rancel, E. Ramírez-Aportela, J. I. Vilas, M. Carroni, S. Fleischmann, E. Lindahl, A. W. Ashton, M. Basham, D. K. Clare, K. Savage, C. A. Siebert, G. G. Sharov, C. O. S. Sorzano, P. Conesa, J. M. Carazo
Open Access
Abstract: Three-dimensional electron microscopy is becoming a very data-intensive field in which vast numbers of experimental images are acquired at high speed. To manage such large-scale projects, we had previously developed a modular workflow system called Scipion (de la Rosa-Trevín et al., 2016). We present here a major extension of Scipion that allows processing of EM images while the data are being acquired. This approach helps to detect problems at early stages, saves computing time and provides users with a detailed evaluation of the data quality before the acquisition is finished. At present, Scipion has been deployed and is in production mode in seven cryo-EM facilities throughout the world.
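The on-the-fly processing idea can be reduced to a loop that picks up new files as the microscope writes them, instead of waiting for the session to end. The sketch below is a generic stand-in using only the standard library; it is not Scipion's actual API, and the file names are invented.

```python
# Sketch of streaming acquisition handling: repeatedly scan an incoming
# directory and hand any unseen micrographs to the pipeline. Illustrative
# only; a real system would align, CTF-estimate, etc. instead of just listing.
import pathlib
import tempfile

def process_new(incoming: pathlib.Path, seen: set) -> list:
    """Return files not handled yet; a real pipeline would process them here."""
    new = [p for p in sorted(incoming.glob("*.mrc")) if p.name not in seen]
    seen.update(p.name for p in new)
    return new

with tempfile.TemporaryDirectory() as d:
    incoming = pathlib.Path(d)
    seen = set()
    (incoming / "mic_001.mrc").touch()        # first micrograph arrives
    first = process_new(incoming, seen)       # picks up mic_001 only
    (incoming / "mic_002.mrc").touch()        # acquisition continues
    second = process_new(incoming, seen)      # only the newly arrived file
    print([p.name for p in first], [p.name for p in second])
```

Feedback arrives per micrograph rather than per session, which is what lets problems be caught while there is still time to fix the acquisition.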
Oct 2018

I13-2-Diamond Manchester Imaging
Abstract: Ill-posed image recovery requires regularisation to ensure stability. The presented open-source regularisation toolkit consists of state-of-the-art variational algorithms which can be embedded in a plug-and-play fashion into the general framework of proximal splitting methods. The packaged regularisers aim to satisfy various prior expectations of the investigated objects, e.g., their structural characteristics, smooth or non-smooth surface morphology. The flexibility of the toolkit helps with the design of more advanced model-based iterative reconstruction methods for different imaging modalities while operating with simpler building blocks. The toolkit is written for CPU and GPU architectures and wrapped for Python/MATLAB. We demonstrate the functionality of the toolkit in application to Positron Emission Tomography (PET) and X-ray synchrotron computed tomography (CT).
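The plug-and-play idea in this abstract is that a regulariser enters a proximal splitting method only through its proximal operator, so one building block can be swapped for another. A hedged, minimal sketch (not code from the toolkit): ISTA for a denoising problem, with the soft-threshold as the interchangeable regulariser block.

```python
# Minimal plug-and-play proximal splitting sketch (not the toolkit's code):
# minimise 0.5*||x - y||^2 + lam*R(x), where R appears only via its prox.
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||x||_1 -- one interchangeable regulariser block."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_gradient(y, prox, lam=0.2, step=1.0, iters=50):
    """ISTA: gradient step on the data-fidelity term, then the prox of lam*R."""
    x = np.zeros_like(y)
    for _ in range(iters):
        grad = x - y                         # gradient of 0.5*||x - y||^2
        x = prox(x - step * grad, step * lam)
    return x

y = np.array([0.05, 1.0, -0.8, 0.1])         # invented noisy signal
x = prox_gradient(y, soft_threshold)         # small entries shrink to zero
print(x)
```

Replacing `soft_threshold` with a different proximal operator (e.g. one enforcing smoothness) changes the prior without touching the iteration, which is the modularity the toolkit packages up for model-based reconstruction.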
May 2018
Jonathan M. Grimes, David R. Hall, Alun W. Ashton, Gwyndaf Evans, Robin L. Owen, Armin Wagner, Katherine E. McAuley, Frank von Delft, Allen M. Orville, Thomas Sorensen, Martin A. Walsh, Helen Ginn, David I. Stuart
Open Access
Abstract: Macromolecular crystallography (MX) has been a motor for biology for over half a century and this continues apace. A series of revolutions, including the production of recombinant proteins and cryo-crystallography, have meant that MX has repeatedly reinvented itself to dramatically increase its reach. Over the last 30 years synchrotron radiation has nucleated a succession of advances, ranging from detectors to optics and automation. These advances, in turn, open up opportunities. For instance, a further order of magnitude could perhaps be gained in signal to noise for general synchrotron experiments. In addition, X-ray free-electron lasers offer to capture fragments of reciprocal space without radiation damage, and open up the subpicosecond regime of protein dynamics and activity. But electrons have recently stolen the limelight: so is X-ray crystallography in rude health, or will imaging methods, especially single-particle electron microscopy, render it obsolete for the most interesting biology, whilst electron diffraction enables structure determination from even the smallest crystals? We will lay out some information to help you decide.
Feb 2018
Bart Alewijnse, Alun W. Ashton, Melissa G. Chambers, Songye Chen, Anchi Cheng, Mark Ebrahim, Edward T. Eng, Wim J. H. Hagen, Abraham J. Koster, Claudia S. López, Natalya Lukoyanova, Joaquin Ortega, Ludovic Renault, Steve Reyntjens, William J. Rice, Giovanna Scapin, Raymond Schrijver, Alistair Siebert, Scott M. Stagg, Valerie Grum-Tokars, Elizabeth R. Wright, Shenping Wu, Zhiheng Yu, Z. Hong Zhou, Bridget Carragher, Clinton S. Potter
Abstract: This paper provides an overview of the discussion and presentations from the Workshop on the Management of Large CryoEM Facilities held at the New York Structural Biology Center, New York, NY on February 6–7, 2017. A major objective of the workshop was to discuss best practices for managing cryoEM facilities. The discussions were largely focused on supporting single-particle methods for cryoEM and topics included: user access, assessing projects, workflow, sample handling, microscopy, data management and processing, and user training.
Aug 2017

Data acquisition
Abstract: In any experimental discipline, raw data represents the source from which all discoveries are derived. A more strict interpretation in X-ray diffraction experiments may refer to this as primary data, since any pixel counts will have been manipulated (e.g. analogue-to-digital conversion, dark-current correction, interpolation of pixels etc.); however, the fundamental idea remains: this is the closest it is possible to get to the original experimental measurements. At Diamond Light Source, the principle of secure recording and storage of the primary data was embedded in the data acquisition system from the outset. The general user does not have permission to alter or delete the raw experimental data, and the acquisition system GDA is designed to prevent over-writing of the images.
Aug 2017

B24-Cryo Soft X-ray Tomography
I13-2-Diamond Manchester Imaging
Open Access
Abstract: Segmentation is the process of isolating specific regions or objects within an imaged volume, so that further study can be undertaken on these areas of interest. When considering the analysis of complex biological systems, the segmentation of three-dimensional image data is a time-consuming and labor-intensive step. With the increased availability of many imaging modalities and with automated data collection schemes, this poses an increased challenge for the modern experimental biologist to move from data to knowledge. This publication describes the use of SuRVoS Workbench, a program designed to address these issues by providing methods to semi-automatically segment complex biological volumetric data. Three datasets of differing magnification and imaging modalities are presented here, each highlighting different strategies of segmenting with SuRVoS. Phase-contrast X-ray tomography (microCT) of the fruiting body of a plant is used to demonstrate segmentation using model training, cryo-electron tomography (cryoET) of human platelets is used to demonstrate segmentation using super- and megavoxels, and cryo soft X-ray tomography (cryoSXT) of a mammalian cell line is used to demonstrate the label-splitting tools. Strategies and parameters for each datatype are also presented. By blending a selection of semi-automatic processes into a single interactive tool, SuRVoS provides several benefits. Overall time to segment volumetric data is reduced by a factor of five when compared with manual segmentation, a mainstay in many image-processing fields. This is a significant saving when full manual segmentation can take weeks of effort. Additionally, subjectivity is addressed through the use of computationally identified boundaries, and by splitting complex collections of objects by their calculated properties rather than on a case-by-case basis.
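The label-splitting step mentioned above takes one merged mask and separates the individual objects inside it. A minimal stand-in for that idea, using connected-component labelling from SciPy rather than SuRVoS itself (the array is a toy 2D example, not real data):

```python
# Illustrative only: separate touching-but-disconnected objects in one binary
# mask via connected-component labelling. A stand-in for the label-splitting
# concept, not SuRVoS code; SuRVoS also splits by computed object properties.
import numpy as np
from scipy import ndimage

mask = np.zeros((8, 8), dtype=bool)
mask[1:3, 1:3] = True       # first object
mask[5:7, 5:7] = True       # second, disconnected object

labels, n = ndimage.label(mask)   # labels: array of object ids; n: object count
print(n)                          # 2 separate objects recovered from one mask
```

Once objects carry distinct labels, per-object measurements (volume, shape) become possible, which is what lets a tool split or merge them by property rather than by hand.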
Aug 2017