BEAR - Supporting Research at the University of Birmingham

Date/Time:
Tuesday 12 July 2016
6.00pm for 6.30pm - 8.00pm
Buffet from 6.00pm

Venue:
Tally Ho Sports and Conference Centre, Pershore Road, Edgbaston, Birmingham, B5 7RN

Cost:
Free and open to all.
Bookings after 5 July are not guaranteed for the buffet.

Speaker:
John Owen, IT Services, University of Birmingham

Details:

John began his career as a research physicist and gradually took on responsibility for the computing resource of his research group (a cluster of HP 2100 machines for anyone with an interest in history!). This preceded a move to the central IT Services department at the University of Birmingham where he has held several positions and latterly was ‘Head of IT Facilities Management’ with a brief that covered servers, storage and data centres.

He has maintained his interest in the research activities of the University and has driven an expansion in IT resources to support research. This resulted in the creation of a new IT ‘Research Support Section’ last summer, which he now heads. The Section’s activities focus on high performance computing and big data management. Collectively these services are known as ‘BEAR’ - the Birmingham Environment for Academic Research.

This presentation will describe the IT services available to researchers at the University of Birmingham. Collectively known as BEAR - the Birmingham Environment for Academic Research (www.birmingham.ac.uk/bear) - these cover high performance computing (HPC) and big data management, as well as a plethora of other activities.

HPC is currently delivered through a fairly traditional (2,000-core) Linux batch system, but during the summer we will launch our innovative private cloud service, based around OpenStack technology and running on water-cooled servers. This is attracting a lot of attention both nationally and internationally.

Storage requirements are growing rapidly, with demand coming particularly from geneticists and computational biologists. We currently provide a few Petabytes of storage, with projected needs in excess of 20 Petabytes by 2020. Securely managing such volumes presents significant operational challenges.

But our user base is not drawn solely from the scientific community - we even have people from Theology running HPC jobs! Some case studies will be described.