IT and Scientific Computing

Our Aims

We aim to find and provide the computing tools and resources our scientists need to carry out their outstanding cancer research. To this end, the facility operates a highly integrated data analysis platform, consisting of a High-Performance Computing system, a Linux virtualisation platform, bare metal servers and cloud services. The platform provides solutions for data storage, including automated data management tools, as well as data processing, analysis, publication, and archiving.

It allows the secure processing and analysis of sensitive high-throughput data. In addition, we offer application and software development support to enable our scientists to use the latest Bioinformatics methods and technologies for their research.

IT and SciCom Schematic

Close collaborations

The Institute operates a multitude of cutting-edge instruments for conducting genomics, proteomics and imaging experiments. The large amounts of data produced by these instruments are stored on a 4PB storage system and can be analysed using an oVirt-based virtualisation platform or on “Griffin”, a heterogeneous 1,500-core High Performance Computing Linux cluster consisting of standard, high-memory and GPU nodes, with a 1PB parallel file system. With its tightly integrated hardware and cloud infrastructure, SciCom operates Linux- and Windows-based high-throughput data analysis and management services.

We work in partnership with the Computational Biology Support Team – find out more about their work on their Facility page.


Virtual machines, practical support

Combining virtual machines hosted on powerful computing hardware with remote visualisation also allows interactive processing of compute-, data-, and memory-intensive workloads, which is especially helpful for proteomics data. Special data protection arrangements on Griffin allow the processing and analysis of access-controlled (e.g. dbGaP, ICGC) and clinical trial data.

Scientific Computing has a strong focus on automating data processing to increase throughput, accelerate analysis, and ease the burden on scientists.

Griffin

Griffin is a heterogeneous InfiniBand Linux cluster designed to analyse a large variety of data types using different methods and algorithms. It consists of 100 standard compute nodes, 2 high-memory nodes, an NVIDIA Redstone GPU (graphics processing unit) system and an FPGA. We decided to replace the former Moab/Torque-based batch system with the modern and widely used SLURM for more effective job management. HPC nodes and storage are connected via a high-speed 100Gb/s InfiniBand fabric, allowing high-speed data transfer between the components. Accessing the system is also much faster than before, as the network bandwidth has been upgraded from 10GbE to 25GbE.
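On a SLURM-managed cluster such as Griffin, work is typically submitted as batch scripts. The following is a minimal, illustrative sketch only: the partition name, module name, and file paths are assumptions for the example, not Griffin's actual configuration.

```shell
#!/bin/bash
# Minimal SLURM batch script (sketch). Partition, module and file names
# below are hypothetical, not Griffin's real configuration.
#SBATCH --job-name=align-sample      # job name shown in squeue
#SBATCH --partition=standard         # hypothetical partition name
#SBATCH --nodes=1                    # run on a single node
#SBATCH --ntasks=1                   # one task (one process)
#SBATCH --cpus-per-task=8            # cores for a multithreaded tool
#SBATCH --mem=32G                    # memory for the job
#SBATCH --time=04:00:00              # wall-clock limit (hh:mm:ss)
#SBATCH --output=%x-%j.out           # %x = job name, %j = job ID

module load bwa                      # environment modules, if available

# Example workload: a multithreaded alignment, using the core count
# SLURM allocated to this job.
bwa mem -t "$SLURM_CPUS_PER_TASK" reference.fa reads.fq > aligned.sam
```

Such a script would be submitted with `sbatch align.sh` and monitored with `squeue -u $USER`; SLURM schedules it onto a suitable node and enforces the requested resource limits.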

With its massive computing power, Griffin is a crucial component of SciCom’s High Throughput Data Analysis platform. It is tightly integrated with various storage systems, the High-Performance virtualisation platform, bare metal servers and cloud services to provide an integrated platform for analysing research and clinical data. The platform covers the entire data analysis lifecycle, spanning data generation, processing, downstream analysis, visualisation, publication, and archiving, while following FAIR principles.

A note from the Team Leader – Marek Dynowski

I am dedicated to managing a top-tier IT and Scientific Computing Core Facility, ensuring high user satisfaction and equipping researchers with the essential tools to advance their work. My primary focus is integrating cybersecurity standards within IT and Scientific Computing, despite their differing priorities. Additionally, I emphasise utilising software development methodologies, such as CI/CD, to automate IT administration and streamline Bioinformatic Pipeline development on HPC systems.

Meet the IT and Scientific Computing team

Marek Dynowski

Core Facility Manager

John Campion

Lead Data Architect

Krar Haider

IT Service Desk Analyst (Mac & Windows)

Stephen Kitcatt

Principal Software Architect

Christopher Mccauley

Web Designer/Developer

Brian Poole

Helpdesk Manager

Matthew Young

Computer Support Officer

Get in touch


Latest from CRUK MI

Cancer Research In the Paterson Building

Find out more about the facilities across the Institute

Leukaemia Immunology & Transplantation

The Leukaemia Immunology and Transplantation laboratory aims to develop a comprehensive strategy to prevent post-transplant relapse in patients treated with allogeneic haematopoietic stem cell transplantation – the only curative therapy for many patients with acute myeloid leukaemia (AML) and other poor-risk haematological malignancies.

Patient derived preclinical models reveal novel biology of SCLC

Immune detection of dying tumour cells can elicit cancer immunity when the host permits it


Careers that have a lasting impact on cancer research and patient care

We are always on the lookout for talented and motivated people to join us. Whether your background is in biological or chemical sciences, mathematics or finance, computer science or logistics, use the links below to see roles across the Institute in our core facilities, operations teams, research groups, and studentships within our exceptional graduate programme.