
OCF DATA signs agreement with Panintelligence to integrate Business Intelligence and Analytics

We’re pleased to announce that OCF DATA has recently signed a partnership agreement with Panintelligence.

Panintelligence PI gives users a multi-dimensional view of their business, anywhere, any time, and on any device. It offers intuitive dashboards composed of in-enterprise and third-party data drawn from various business process areas. Users drill down into different data sets for instant analysis and insights, allowing management to drive improvements in business or process performance.

Cliff Brereton, Director at OCF DATA, comments, “Quite often our customers have all the answers to optimal performance contained within their data, but due to the sheer size of the data held it is impossible to extract these insights and put them to use in optimising performance without appropriate tools such as PI. This is an exciting opportunity for us to help those clients who want to get started on their data journey and deliver better operational performance. Our partnership with Panintelligence will provide the technology and the know-how to make these performance improvements a reality.”

The new partnership will see OCF DATA and Panintelligence provide solutions for four key industries: manufacturing, higher education, healthcare and utilities. Both companies will exhibit at MACH 2018 in April, the UK’s leading manufacturing technologies event, showcasing how the Panintelligence dashboards can help to optimise production performance, forecast production volumes and supply chain, and enhance customer engagement.

Zandra Moore, Sales and Marketing Director at Panintelligence commented, “We are delighted to welcome OCF DATA into the Panintelligence partner community. OCF DATA is a strategic adviser and data consultancy business, working in some of our core markets. We look forward to working together to drive the pace of change in these sectors by improving outcomes through self-service data analysis.”


CLIMB victorious at HPC Wire Readers’ Choice Awards

A solution designed and integrated by OCF has been announced as a winner in two categories at the 2017 HPC Wire Readers’ Choice Awards.

Announced at SuperComputing 2017 in Denver, USA, the Cloud Infrastructure for Microbial Bioinformatics (CLIMB) has won the awards for ‘Best Use of HPC in Life Sciences’ and ‘Best HPC Collaboration in Academia, Government or Industry’.

CLIMB is a UK-based cloud project funded by the UK’s Medical Research Council to support research by academic microbiologists. The current live system is located across the Universities of Birmingham, Cardiff and Warwick.

The ‘Best Use of HPC in Life Sciences’ award recognised real-time analysis of Zika genomes using CLIMB cloud computing, supported by Lenovo, OpenStack, IBM, Red Hat and Dell EMC. The ‘Best HPC Collaboration in Academia, Government or Industry’ award was given to CLIMB for the provision of resources for projects with global impact on public health, drawing on the expertise of Lenovo, OpenStack, IBM Spectrum Scale, Red Hat and Dell EMC.

Simon Thompson, the Research Computing Specialist who built the initial CLIMB pilot system, comments, “We are delighted that CLIMB’s achievements have been recognised by the HPC Wire Readers’ Choice Awards. CLIMB is one of the most complex environments we’ve had to build, and we learned a lot of new technologies along the way. We were able to work with our contacts at OCF, IBM and Lenovo to help us build such a successful HPC platform.”

CLIMB is one of the first multi-site OpenStack deployments in the UK dedicated to supporting the needs of research computing and remains the only truly federated academic research cloud operating in the UK. A fourth node, based at Swansea, is currently under development and due for launch later in 2017.

Since building the CLIMB cloud, the research partnership has expanded to include the Universities of Bath and Leicester as well as the BBSRC funded Quadram Institute. It’s open to all microbial researchers in the UK and has even been used to support training courses in the Gambia and Vietnam. Currently, CLIMB supports over 200 research groups from across academia, government and healthcare.

One recent project using the CLIMB infrastructure was research into the Zika virus. Professor Nick Loman, Professor of Microbial Genomics and Bioinformatics at the University of Birmingham (and CLIMB Fellow), says, “Genome sequencing can generate rapid insights into the scale and patterns of spread of important epidemics. When Zika struck the Americas, we were able to respond rapidly by deploying portable sequencing to affected areas, generating sequence data in days. Comparative genome sequence analysis requires significant computation and storage.

“The new system provides rapid, on-demand computation to speed up the analysis. We can instantly access hundreds of CPUs, thousands of gigabytes of RAM and tens of terabytes of storage. This means we can keep pace with the rapid data generation, and quickly release important new findings to the scientific and public health community in order to assist epidemic response efforts.”

The use of mobile field labs and cloud computing resources followed on from previous work with Ebola. As part of the CLIMB offering, researchers are able to access toolsets developed by other institutions including GVL (Melbourne University) and EDGE bioinformatics (Los Alamos).

With over 7,500 vCPU cores of processing power, the CLIMB system is the largest single system in the world designed specifically for microbiologists. The CLIMB infrastructure runs OpenStack, is physically located across four university sites, and is made up of Lenovo System x servers with either 512GB or 3TB of RAM and up to 192 cores, enabling researchers to request huge amounts of computing resource on the fly.

Storage is provided by IBM Storwize with IBM Spectrum Scale and Red Hat Ceph running on Dell servers. The pilot service for CLIMB was developed at the University of Birmingham, with the architecture subsequently replicated at the other sites.

The CLIMB system was designed, and is led on the technical side, by Tom Connor (Cardiff), Simon Thompson and Nick Loman (Birmingham). The system itself sits within a wider project, led by Mark Pallen (Quadram) and Sam Sheppard (Bath), that includes the provision of refurbished bioinformatics space, training courses and three research fellowships.

We look forward to continuing our support with the CLIMB project in the future. More details on the CLIMB project can be found at http://www.climb.ac.uk/.


OCF helps to develop AI infrastructure at the University of Oxford

The University of Oxford has become the first academic institution in the UK to take delivery of an NVIDIA DGX-1 supercomputer powered by the latest GPU technology – NVIDIA Volta.

Picking up a Petaflop: (L-R) Dr. David Jenkins, Head of Research Computing and Support Services University of Oxford, Dr. Steven Young, ARC Technical Services Manager and Dr. Robert Esnouf, Director of Research Computing BDI & Head of Research Computing Core WHG

The new system has been supplied by OCF and funded via a collaboration between the University’s IT Services department, Wellcome Centre for Human Genetics (WHG), Big Data Institute (BDI) and the Weatherall Institute of Molecular Medicine (WIMM).

The system will be housed and managed by the University’s Advanced Research Computing (ARC) facility and responds to an explosion in demand from researchers keen to apply deep learning to their work. In the life sciences, current research includes more accurate sequencing, predicting gene expression levels, simulating brain activity, predicting outbreaks of diseases such as malaria, and analysing population-scale data such as those from the UK Biobank study. In other disciplines it includes research into autonomous vehicles, natural language processing and computer vision.

“Oxford University is at the forefront of AI research in Europe, so it’s fitting this should be the first academic institution in the UK to receive a DGX-1 powered by our Volta architecture,” comments Stuart Wilson, Supercomputing & AI Director UK for NVIDIA. “The DGX-1 installation forms the centrepiece of the University’s shared AI infrastructure. It will be extremely exciting to see how students, researchers and developers at the University use this AI supercomputer to further their ground-breaking work.”

“OCF thrives on working with our customers to provide the latest technologies to enhance research,” comments Julian Fielden, OCF Managing Director. “We’re delighted to be bringing together researchers from across all divisions of the University of Oxford to enable them to carry out such exciting projects.”

OCF has been a business partner with NVIDIA for over a decade and recently achieved Elite Partner level status with NVIDIA for Accelerated Computing, becoming only the second business partner in northern Europe to achieve this level.


OCF supercomputer speeds up research at the University of Exeter

Researchers from across the University of Exeter are benefitting from a new High Performance Computing (HPC) machine, called Isca. Existing departmental HPC resources within Life Sciences and Physics were coming to the end of their life, so, using funding from the University and a large grant from the Medical Research Council, the University acquired a new, central HPC resource to support researchers University-wide across numerous disciplines.

The new system has already contributed to research into the modelling and formation of stars and galaxies, to Computational Fluid Dynamics (CFD) work within Engineering to understand how flooding affects bridges, and to Medical School studies of genetic traits in diabetes using data from the UK Biobank. The HPC resource is now in use by more than 200 researchers across 30+ active research projects in the Life Sciences, Engineering, Mathematics, Astrophysics and Computing departments.

As part of the original tender, the University asked for options to provide temporary housing for the new HPC machine whilst work on a new data hall was being finished. High performance compute, storage and data analytics integrator, OCF, proposed a unique solution to house the new HPC machine in a Rapid Deployment Data Centre (RDDC) container solution from Stulz Technology Integration Limited (formerly TSI UK).

Nicknamed The Pod, this is a dedicated HPC-related containerised solution developed by Stulz Technology Integration, which was custom fabricated for the University by the data centre design and build specialists. OCF designed, integrated and configured the HPC machine and had the entire system delivered in its container to the University in July 2016.

“This was phase one of the new supercomputer, located on campus in the specialised container, where the machine ran for the first twelve months,” commented David Barker, Technical Architect at the University of Exeter. “We tested and used the system while it was housed in the temporary location to give us an understanding of what we used a lot of; this informed phase two of the project which was to expand the system with the help of OCF and move it to its final location in the new data centre on campus.”

In addition, OCF and Lenovo jointly worked on the design of the computer to support the differing needs of the life sciences and physics researchers, which required virtualised and traditional environments respectively. The new 6,000-core system comprises Lenovo NeXtScale servers with a number of NVIDIA GPU nodes, Intel Xeon Phi processors and additional high-memory compute resources to cater to these needs.

Lenovo’s NeXtScale servers are connected through Mellanox EDR InfiniBand to three GS7K parallel file system appliances from DDN Storage, providing just under one petabyte of capacity. OCF’s own open-source HPC software stack, based on xCAT, runs on the system along with RDO OpenStack, NICE DCV and Adaptive Computing Moab.

“As well as having the standard nodes, we also have various pieces of specialist kit which includes NVIDIA GPU nodes, Intel Xeon Phi nodes and OpenStack cloud nodes as well,” commented David. “We wanted to ensure that the new system caters for as wide a variety of research projects as possible, so the system reflects the diversity of the applications and requirements our users have.”

The impact on research has been significant, with researchers seeing a 2-3x speed-up compared to the previous departmental clusters.

“We’ve seen in the last few years a real growth in interest in High Performance Computing from life sciences, particularly with the availability of new high-fidelity genome sequencers, which have heavy compute requirements, and that demand will keep going up,” comments David Acreman, Senior Research Fellow at the University of Exeter. “Isca is proving to be an incredibly valuable service to the whole university and is now proving indispensable to our research groups.”


OCF DATA signs agreement with Atos to sell Bullion high-end servers

OCF DATA Limited has signed a partner channel agreement with Atos, a leader in digital transformation, to resell the Atos Bullion server range to NHS Trusts and higher education institutions, as well as to companies in the manufacturing and utilities sectors in the UK.

OCF DATA will now place the powerful SAP- and Oracle-certified 8- to 16-socket x86 Bullion servers at the centre of its data analytics offering. The Bullion range is highly differentiated: its modularity, high performance and flexible functionality make it well suited to high-performance in-memory computing analytics solutions.

Andy Grant, Head of Big Data and HPC, Atos UK&I said: “We are delighted to be working with OCF DATA to bring the capabilities of Bullion Big Data servers to new market sectors. Whether it is for large scale virtualisation, high performance data analytics or in-memory computing the Bullion architecture delivers market leading price performance and RAS capabilities, helping organisations manage their costs.”

OCF DATA Director Cliff Brereton said of the partnership: “This is an exciting opportunity for us to deliver the world-class Atos infrastructure technology to IT departments, where this will form the core infrastructure of an organisation’s internal data services. Whether it is running enterprise databases, managing and exploiting data lakes or improving the performance of Java applications, these solutions will improve performance and reduce licence costs, footprint and power costs over existing in-house deployments.”


New Sales Leader at OCF DATA

Lee Hannis joins OCF DATA’s growing expertise in Data Analytics

OCF DATA, the analytics and insight division of OCF plc, has appointed Lee Hannis to the role of Sales Leader. Lee will engage with UK organisations in the healthcare, higher education, utilities and manufacturing industries to help design and integrate data analytics solutions using a network of world leading vendor partner technology.

Drawing on previous knowledge and experience as head of business development at the Hartree Centre, Lee will consult with customers on the challenges they face in getting valuable business insights from their data. Lee’s experience of delivering operational data analysis and cognitive solutions will help clients fast track their data strategy and implementation.

“We’re seeing a lot of interest from customers around data analytics and the potential it has to enable better business profits as well as improved outcomes for customers,” comments Cliff Brereton, Managing Director at OCF DATA. “With Lee’s appointment, we’ll be able to better meet the increasing demand we’re seeing from customers.”

He adds: “With Lee’s experience we are able to consult with our customers and prospects on data analytics to help them find the best solution possible – whether that is enhanced workflow management in healthcare or better student performance for universities. Lee will be a valuable asset to the team at OCF DATA.”

Lee brings with him a wealth of experience. In his previous role as head of business development, he successfully led bids and developed strategic relationships with clients including Rolls-Royce, Unilever, Schlumberger and Barclays, and with innovative industry thought leaders such as Land Rover BAR and Alder Hey Hospital Trust. While responsible for the management and direction of the Business Development team, he also established the data analytics consulting practice and led the pioneering delivery of cognitive technology into healthcare. Previously, Lee led the Government and Civil Agency business unit at the UK’s leading biometric and identity specialist, Human Recognition Systems. Whilst there, he successfully led the business into national and international contracts, delivering some of the most advanced biometric systems to HMPS, the UK MoD, DSTL, the United Nations, the Olympic Delivery Authority and other international agencies.


iRODS Consortium welcomes OCF as newest member

We are pleased to announce that we will now have a role in future iRODS development.

We are the latest organisation to join the iRODS Consortium, the membership-based foundation that leads development and support of the integrated Rule-Oriented Data System (iRODS).

A key focus for OCF is helping research organisations meet their requirements for long-term storage of vast quantities of research data. As members of the iRODS Consortium, we will further develop our ability to deliver and support iRODS solutions and will have input into the future development of iRODS to ensure it continues to meet the needs of our customers.

“OCF has long followed the development of iRODS in the community and is excited to work with the consortium to take this to the next level,” said Andrew Dean, OCF business development manager.

The iRODS Consortium is a membership-based organisation that guides development and support of iRODS as free open source software for data discovery, workflow automation, secure collaboration, and data virtualisation. The iRODS Consortium provides a production-ready iRODS distribution and iRODS professional integration services, training, and support. The consortium is administered by founding member RENCI, a research institute for applications of cyberinfrastructure located at the University of North Carolina at Chapel Hill, USA.

“OCF has a history of providing the best data storage and data management solutions to its customers, so having them as a member of the iRODS Consortium is an excellent fit and a relationship that we look forward to building and nurturing,” said Jason Coposky, executive director of the iRODS Consortium. “We look forward to their input and participation and are pleased to be part of the data management solutions offered to so many UK organisations.”

In addition to OCF, current iRODS Consortium members include Bayer, Dell/EMC, DDN, HGST, IBM, Intel, MSC, the National Institute for Computational Science at the University of Tennessee, Panasas, RENCI, Seagate, University College London, Utrecht University, and the Wellcome Trust Sanger Institute.

To learn more about iRODS and the iRODS Consortium, please visit irods.org.

 


OCF achieves Elite Partner status with NVIDIA

OCF has successfully achieved Elite Partner status with NVIDIA® for Accelerated Computing, becoming only the second business partner in Northern Europe to achieve this level.

Awarded in recognition of OCF’s ability and competency to integrate a wide portfolio of NVIDIA’s Accelerated Computing products including TESLA® P100 and DGX-1™, the Elite Partner level is only awarded to partners that have the knowledge and skills to support the integration of GPUs, as well as the industry reach to support and attract the right companies and customers using accelerators.

“For customers using GPUs, or potential customers, earning this specialty ‘underwrites’ our service and gives them extra confidence that we possess the skills and knowledge to deliver the processing power to support their businesses,” says Steve Reynolds, Sales Director, OCF plc. “This award complements OCF’s portfolio of partner accreditations and demonstrates our commitment to the vendor.”

OCF has been a business partner with NVIDIA for over a decade and has designed, built, installed and supported a number of systems throughout the UK that include GPUs. Most recently, OCF designed, integrated and configured ‘BlueCrystal 4’, a High Performance Computing (HPC) system at the University of Bristol, which includes 32 nodes with two NVIDIA Tesla P100 GPU accelerators each.

In addition, as a partner of IBM and NVIDIA via the OpenPOWER Foundation, OCF has supplied two IBM® Power Systems™ S822LC for HPC systems, codenamed ‘Minsky’, to Queen Mary University of London (QMUL).

The two systems, which pair a POWER8 CPU with four NVIDIA Tesla P100 GPU accelerators, are being used to aid world-leading scientific research projects as well as teaching, making QMUL one of the first universities in Britain to use these powerful deep learning machines. The university was also the first in Europe to deploy an NVIDIA DGX-1 system, described as the world’s first AI supercomputer in a box.


OCF works with the University of Birmingham to improve Research Data Management

Working with OCF and IBM, the University of Birmingham Research Computing Team has selected IBM’s Spectrum Scale (formerly GPFS) Data Management Edition to underpin its research data storage systems.

Dr John Owen, Head of Research Support at the University said, “Looking after research data is key to our researchers. The value of the research we undertake is locked away in the data and we need to ensure that our researchers have safe, secure and reliable storage to keep it”.

Research Computing at the University provides a wide range of advanced computing facilities that are free for researchers to use, from the BlueBEAR HPC system and the BEARCloud research cloud platform, through research data services, to the high-performance networking that connects research equipment to the storage. Ensuring data is properly looked after is at the heart of all these platforms.

Simon Thompson, Research Computing Infrastructure Architect, says, “We’ve been long-term users of Spectrum Scale for our storage systems, and it’s a very stable platform for us. We use it both for data storage and, as part of our private cloud deployment, for storing VM images. This means we have a single data management plane and can easily place data into different classes of storage depending on the need for access. For example, data we need to archive can be moved to tape via the Spectrum Protect integration, from where users can seamlessly restore it. One of the key features of Spectrum Scale for us is that we can use different classes and vendors of storage underneath it, so we can optimise placement for performance depending on the workload.”
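The tiered placement Simon describes is expressed in Spectrum Scale through SQL-like ILM policy rules. As a rough illustration only (the pool names ‘fast’ and ‘nearline’ are hypothetical examples, not taken from the Birmingham deployment), rules of this shape place new files on one class of storage and migrate cold data down:

```sql
/* Illustrative Spectrum Scale (GPFS) ILM policy sketch.
   Pool names are hypothetical, not the University's actual configuration. */

/* New files land on the fast tier by default. */
RULE 'default' SET POOL 'fast'

/* When the fast pool exceeds 85% occupancy, migrate the least recently
   accessed files to the nearline tier until occupancy falls to 70%. */
RULE 'tier_down'
  MIGRATE FROM POOL 'fast' THRESHOLD(85,70)
  WEIGHT(CURRENT_TIMESTAMP - ACCESS_TIME)
  TO POOL 'nearline'
```

A policy file like this would typically be applied with Spectrum Scale’s mmapplypolicy command; the same rule language also drives the placement policies mentioned below for selecting which data is encrypted.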

He continues “As we bring in more research data from local storage, we have increasing requirements for encryption, and Data Management Edition provides us with a certified platform that allows us to support encrypted data at rest. We’ll be using placement policies to define which classes of data need to be encrypted, but for our users it will be transparent.”

Data Management Edition brings a new license model to Spectrum Scale, licensed by the capacity of the system. As the University has grown the number of systems and the amount of storage, its existing licenses had become difficult to manage. Simon continues, “Previously we had to account for client and server licenses, and with a mix of different vendor and OEM licenses this was proving difficult. In addition, we were often designing for optimal license use rather than the best architectural solution. With Data Management Edition, we no longer need to worry about server and client licenses, which is of particular interest for some of the environments we are looking to develop in our private cloud infrastructure.”

Looking to the future, the University is keen to explore other features of Data Management Edition; transparent cloud tiering, which allows automatic migration of data into object storage, is of particular interest.

Working with OCF and the IBM sales team, the University was able to migrate their existing IBM standard licenses over to the new license model.


OCF deliver new 600 Teraflop HPC machine for University of Bristol

For over a decade the University of Bristol has been contributing to world-leading and life-changing scientific research using High Performance Computing (HPC), having invested over £16 million in HPC and research data storage. To continue meeting the needs of its researchers, who work with complex and large amounts of data, the University has acquired a new HPC machine, named BlueCrystal 4 (BC4).

Designed, integrated and configured by the HPC, storage and data analytics integrator OCF, BC4 has more than 15,000 cores, making it the largest UK university system by core count, and a theoretical peak performance of 600 teraflops.

Over 1,000 researchers in areas such as paleobiology, earth science, biochemistry, mathematics, physics, molecular modelling, life sciences, and aerospace engineering will be taking advantage of the new system. BC4 is already aiding research into new medicines and drug absorption by the human body.

“We have researchers looking at whole-planet modelling with the aim of trying to understand the earth’s climate, climate change and how that’s going to evolve, as well as others looking at rotary blade design for helicopters, the mutation of genes, the spread of disease and where diseases come from,” said Dr Christopher Woods, EPSRC Research Software Engineer Fellow, University of Bristol. “Early benchmarking is showing that the new system is three times faster than our previous cluster – research that used to take a month now takes a week, and what took a week now only takes a few hours. That’s a massive improvement that’ll be a great benefit to research at the University.”

BC4 uses Lenovo NeXtScale compute nodes, each comprising two 14-core 2.4 GHz Intel Broadwell CPUs with 128 GiB of RAM. It also includes 32 nodes each with two NVIDIA Pascal P100 GPUs, plus one GPU login node, designed into the rack by Lenovo’s engineering team to meet the specific requirements of the University.
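The quoted 600-teraflop peak is consistent with this node specification. A back-of-envelope sketch, assuming (this is not stated in the article) the typical Broadwell figure of 16 double-precision FLOPs per core per cycle from two 4-wide AVX2 FMA units:

```python
# Rough sanity check of BC4's ~600 TFLOPS theoretical peak, CPU side only.
# Assumption (not from the article): each Broadwell core retires 16
# double-precision FLOPs per cycle (2 FMA units x 4 DP lanes x 2 ops per FMA).

cores = 15_000          # "more than 15,000 cores"
clock_hz = 2.4e9        # 2.4 GHz Broadwell clock
flops_per_cycle = 16

peak_tflops = cores * clock_hz * flops_per_cycle / 1e12
print(f"Estimated CPU peak: {peak_tflops:.0f} TFLOPS")
```

This lands at roughly 576 TFLOPS before counting the P100 GPU nodes, in line with the quoted ~600 teraflops.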

Connecting the cluster are several high-speed networks, the fastest of which is a two-level Intel Omni-Path Architecture network running at 100Gb/s. BC4’s storage comprises one petabyte of disk provided by DDN’s GS7K and IME systems, running IBM’s Spectrum Scale parallel file system.

Effective benchmarking and optimisation, using the benchmarking capabilities of Lenovo’s HPC research centre in Stuttgart, the first of its kind, has ensured that BC4 is highly efficient in terms of physical footprint while fully utilising the 30 kW per-rack energy limit. Lenovo’s commitment to third-party integration has allowed the University to avoid vendor lock-in while permitting new hardware to be added easily between refresh cycles.

Dr Christopher Woods continues: “To help with the interactive use of the cluster, BC4 has a visualisation node equipped with NVIDIA Grid vGPUs so it helps our scientists to visualise the work they’re doing, so researchers can use the system even if they’ve not used an HPC machine before.”

Housed at VIRTUS’ LONDON4, the UK’s first shared data centre for research and education in Slough, BC4 is the first of the University’s supercomputers to be held at an independent facility. The system is directly connected to the Bristol campus via JISC’s high speed Janet network. Kelly Scott, account director, education at VIRTUS Data Centres said, “LONDON4 is specifically designed to have the capacity to host ultra high density infrastructure and high performance computing platforms, so an ideal environment for systems like BC4. The University of Bristol is the 22nd organisation to join the JISC Shared Data Centre in our facility, which enables institutions to collaborate and share infrastructure resources to drive real innovation that advances meaningful research.”

Applications running on the University’s previous cluster, currently numbering in the hundreds, will be replicated onto the new system, allowing researchers to create more applications and better-scaling software. Applications can be moved directly onto BC4 without the need for re-engineering.

“We’re now in our tenth year of using HPC in our facility. We’ve endeavoured to make each phase of BlueCrystal bigger and better than the last, embracing new technology for the benefit of our users and researchers,” commented Caroline Gardiner, Academic Research Facilitator at the University of Bristol.

Simon Burbidge, Director of Advanced Computing comments: “It is with great excitement that I take on the role of Director of Advanced Computing at this time, and I look forward to enabling the University’s ambitious research programmes through the provision of the latest computational techniques and simulations.”

Due to be launched at an event on 24th May at the University of Bristol, BC4 will support over 1,000 system users carried over from BlueCrystal Phase 3.



    Contact Us

    HEAD OFFICE:
    OCF plc
    Unit 5 Rotunda, Business Centre,
    Thorncliffe Park, Chapeltown,
    Sheffield, S35 2PG

    Tel: +44 (0)114 257 2200
    Fax: +44 (0)114 257 0022
    E-Mail: info@ocf.co.uk

    SUPPORT DETAILS:
    OCF Hotline: 0845 702 3829
    E-Mail: support@ocf.co.uk
    Helpdesk: support.ocf.co.uk

    DARESBURY OFFICE:
    The Innovation Centre, Sci-Tech Daresbury,
    Keckwick Lane, Daresbury,
    Cheshire, WA4 4FS

    Tel: +44 (0)1925 607 360
    Fax: +44 (0)114 257 0022
    E-Mail: info@ocf.co.uk

    OCF plc is a company registered in England and Wales. Registered number 4132533. Registered office address: OCF plc, 5 Rotunda Business Centre, Thorncliffe Park, Chapeltown, Sheffield, S35 2PG
