
Wednesday, 31 August 2011

Balancing privacy of individuals against opportunities to increase awareness of cancer

There has been a lot in the news recently about Steve Jobs' fight with pancreatic cancer. The impact on Apple has been discussed at length (BBC Apple coverage, Reuters, BBC, CNN, Apple, GenomeWeb), as has the possible impact on Steve Jobs himself. I guess this is just one of the problems of being a VIP, celebrity, etc.

I do struggle a little, however, with the fact that this is one person amongst about 280,000 worldwide who get cancer of the pancreas. It is a horrible disease and only about 16% of people live for more than one year after diagnosis. Research on pancreatic cancer is being done, and we are learning more about the disease and why it might be so difficult to treat. Science recently published a paper from Professor David Tuveson's group here at CRI demonstrating that pancreatic cancers are hypovascular and that increasing that vasculature can allow more effective treatment in mice.

Other celebrities have had cancer, of course. There was a lot of news coverage when Kylie Minogue was diagnosed with breast cancer. At the time CRUK reported that public understanding of breast cancer was quite wrong, with most people thinking that the disease is more likely to affect people under 70. They termed this "the Kylie effect".

If Mr Jobs' newsworthiness as a pancreatic cancer patient increases public awareness, focuses more researchers on this cancer and results in more research £, $ or € being spent, then we can hope for more advances and better treatment for all.

I can't help but think that headlines like GenomeWeb's "See you later, Steve Jobs" are not the right thing to be writing about anyone with cancer, though.

Tuesday, 30 August 2011

The NIH view on core facilities Part 2: what does it mean for you and me

What should a core offer?
The authors of the Science Translational Medicine paper list four things a core needs: (i) sophisticated instruments, (ii) staff expertise in their operation, (iii) expertise in analysing the data and (iv) an ability to provide advice through consultation with users.
In my case the volume of samples being processed and the varied services being provided in my core make number (iii) difficult. There is a separate Bioinformatics core here at CRI, so much of the analysis within my lab focuses on primary QC of data, to make sure project data are of high quality before they are returned for secondary and tertiary analysis. This QC can be a job in itself; I rarely get time to actually analyse data, and when I do I'm not far past page one for most packages. I try to make sure I understand the impact of biology on the experiment and how the analysis works in basic terms, and to consider how we can affect the results by changing what we do in the lab or by tweaking the application.
I completely agree with the authors' view that the consultation and advice provided by core labs can be as important as any actual data provided; I'd be interested to hear the views of core users on this as well. Core heads can be significantly more up to date on research methods and applications than users. From a sequencing standpoint, a PI knows they want to sequence something, but it can be a long time since they did this themselves and the state of the art has changed fundamentally. Having someone local who can answer questions, suggest alternative methods and give impartial advice and feedback helps. I'd also like to think that when talking to someone in a local core, people are less afraid of asking what he or she might consider "stupid" questions.

How much money is spent on core facilities?
Apparently the NIH spends about $900M a year on core facilities. The paper presents data from an analysis of 520 P30 (core infrastructure support) awards for 2010 totalling $637M; the $900M figure assumes a 40% underestimate of activity because of the difficulty in collecting this data. That seems like a huge margin for error to me, and I'd suggest the NIH find a better way of recording and accounting for core activity.
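
As a sanity check on where that $900M comes from, here is a minimal sketch of the arithmetic as I read it. Treating the "40% underestimate" as a 1.4x scaling of the P30 total is my own interpretation, chosen because it lands right on the headline figure:

```python
# A minimal sketch of how the $900M headline seems to be derived (my reading,
# not the authors' stated method): scale the P30 award total up by the
# assumed 40% of core activity the analysis failed to capture.
p30_total = 637e6            # 520 P30 awards, FY2010 (figure from the paper)
underestimate = 0.40         # assumed fraction of activity missed
estimate = p30_total * (1 + underestimate)
print(f"${estimate / 1e6:.0f}M")   # -> $892M, i.e. roughly "$900M a year"
```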
It was interesting to find out that the NIH spent $300M of Recovery Act investment on shared and/or high-end instrumentation, and that more confocal microscopes, mass specs and biomedical imagers were bought than next-gen sequencers. I guess this just shows how much I have my head in the sand when it comes to research areas outside genomics. I also can't help but wonder how many more genomes we could have sequenced, and does the world really need that many confocal microscopes?

How are UK core facilities organised?
In the UK there are examples of small, medium and large cores. Some offer services to single groups or institutes, others to multiple institutes or to anyone in the UK. Some operate on a consumables-only recovery model and others work towards full economic cost recovery. So the UK has every possible type of core according to the NIH models; can we say which works best?
There are examples of all of these in how UK next-generation sequencing is provided, with over half of instruments being in large to medium-sized core labs. There are around 30-40 next-generation sequencing labs in the UK that together have over 120 instruments (65% Illumina, 20% Roche, 15% Life Technologies). The largest funder of these is the Wellcome Trust, with the Wellcome Trust Sanger Institute the largest single lab in the UK (35 instruments), although its services are mainly internal to the WTSI. The MRC has decided on a distributed hub model with four centres (collectively nearly 30 instruments). BBSRC has funded the Genome Analysis Centre (8 instruments) to provide services to all BBSRC grant holders. And CRUK, which I work for, has invested in two core labs (collectively 5 instruments): CRI, where the sequencing is a collaborative service across four local institutes, and LRI, which primarily offers services internally. There are at least another 50 instruments in smaller UK labs. (All data from the Google map.)
I would not argue that the smaller labs should be consolidated with the larger ones, nor that the UK should have one überlab, but I doubt all 120 instruments are being used at maximum capacity, and the total investment in instruments has been huge. I estimate the three-year total for next-generation sequencing spend is likely to be over £100M, with £40M on instrument purchases alone (not including upgrades of obsolete instruments, e.g. GA to HiSeq or SOLiD to 5500XL). My estimate is based on a 25% discount for instruments and consumables plus around 100 staff engaged in running or supporting the sequencing; a rough version of the arithmetic is sketched below. Could the UK, or any other country, achieve more "bang for its buck"?
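
To make that back-of-envelope estimate reproducible, here is a minimal sketch. The average list price, per-instrument consumables spend and per-person staff cost are my own illustrative assumptions (the only stated inputs are the 120 instruments, the 25% discount and the ~100 staff), so treat the output as indicative rather than audited:

```python
# Back-of-envelope reconstruction of the ">£100M over three years" estimate.
# Inputs marked "assumed" are illustrative guesses, not published figures.

instruments = 120                 # instruments counted on the Google map
discount = 0.25                   # stated discount on instruments and consumables
years = 3

list_price = 450_000              # assumed average list price per instrument (GBP)
instrument_spend = instruments * list_price * (1 - discount)                  # ~£40M

consumables_list = 180_000        # assumed consumables list spend per instrument-year
consumables_spend = instruments * consumables_list * (1 - discount) * years  # ~£49M

staff = 100                       # people running or supporting the sequencing
staff_cost = 50_000               # assumed fully loaded annual cost per person (GBP)
staff_spend = staff * staff_cost * years                                      # ~£15M

total = instrument_spend + consumables_spend + staff_spend
print(f"£{total / 1e6:.0f}M")     # -> £104M, comfortably over £100M
```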

How do you find a core?
There is no single place to go for information. Certain communities have their own resources, and the funding agencies don't put a lot of effort into this. I already suggested that Google might not be the best place to start, as a search on 'DNA sequencing core facility' returns 1.2m hits!
I am happy to point out my conflict of interest before suggesting that a great example is the Google map of next-gen sequencers. This has over 500 facilities on it, but of course is limited to next-gen sequencing. We have been thinking about extending it to other technologies and making more of a distinction between stand-alone academic and/or core labs and fully commercial facilities. I'd be happy to hear readers' views on that idea.
Both the ABRF and the Vermont Genetics Network host databases of core facilities. These can be more comprehensive, if harder to navigate. Two new projects funded by NCRR, VIVO and eagle-i, both aim to catalogue information about core facility people and resources. However, most users find a core because it is in their institution or down the hall in the university, and many seek out a particular lab through word-of-mouth recommendation.
It pays for us as core lab managers to run our labs as efficiently and courteously as possible, always generating the highest quality data, whilst maintaining up-to-date applications and technologies, for as little money as possible, and still keeping one eye open for that next career move. An impossible job, perhaps?

Yes. But one I still happily get out of bed for every day.

The NIH view on core facilities

A recent commentary article in Science Translational Medicine on core facilities is probably of interest to the other core lab directors reading my blog. I wanted to look at some of the points it raises and give my thoughts as a core facility director.

Gregory Farber from NIH and Linda Weiss from NCI have written a very nice piece on the need for efficient and organised core facilities. They outline what has been going on at the NIH to "strengthen core facilities" and discuss four focus areas: (i) information underload - making better quality information about cores available; (ii) paving the core career path - developing career paths and training for core heads; (iii) government regulation - improving understanding of regulatory requirements; and (iv) core facility fusion - aiding core centralisation and consolidation.

(i) Information underload: it needs to be easier to find information about core facilities - where they are, what services they offer, etc. Of course this sounds simple, but a search on Google for 'core facility' returns 7m results, whilst the more specific 'DNA sequencing core facility' still returns 1.2m!
There are two current databases and two new initiatives mentioned in the paper. Databases of core facilities are available from the ABRF and the Vermont Genetics Network. The new projects funded by NCRR are VIVO and eagle-i. I'd like to take the chance to mention the Google map of next-gen sequencers. This has 500 facilities on it and has not cost the American taxpayer a cent. I'd happily apply for an NIH grant to make it available under their umbrella.

(ii) Paving the core career path: core facility heads need to receive better training in the business aspects of running an SME. There is more and more formal training for new group leaders, but a core director has to work in a very different setting. There is also often no obvious career path for core staff, who are often employed on soft money. Improvements in both of these would help to stabilise core staffing.

(iii) Government regulation: cores may have to follow government regulations on cross-charging. There are three main funding schemes in cores: reagents only; reagents plus service contracts (with or without some staff time); or full economic cost recovery. It is not clear which model offers the best value for money, and each can have consequences for core demand. For instance, many core directors will have been faced with the need to cover costs on low-demand services, which results in costs increasing to a level where those services are effectively priced out of the market. This is fine from a purely business perspective, but finances are not the only consideration; there can be other important factors in keeping a service local even if demand is low. This needs to be a local decision, so the local stakeholders may need to pay more.

(iv) Core facility fusion: there is little evidence to determine whether large centralised cores offer better value than small decentralised ones. The authors briefly discuss the pros and cons of different management structures, of centralisation versus distribution, and the requirement of users to have a core that understands their needs. Of particular interest to US readers is probably the discussion of consolidating cores where the research population is low, and of the IDeA core labs.

Whilst the paper is written from the perspective of current funding levels and a need to save research $, £ or €, much of what is discussed has probably been thought about by core facilities and their users for the past 15-20 years.

They state that access to scientific core facilities, with sophisticated instruments and experienced staff, has become essential for much clinical and translational research. I'm sure anyone who has had access to a high-quality core would argue this for any area of scientific research. A good core should make science easier, allowing research staff to move quickly between different and difficult techniques to get useful and robust data. As an aside, I'd recommend you take a look at "What kind of core are you?"
 
My summary:
The main discussion point of the paper is that cores should be run as efficiently as possible; I don't think anyone would argue with that suggestion. This requires that cores monitor and record their efficiency and have external benchmarks to compare against. I am not aware of any external benchmarking data for this purpose; perhaps we should start publishing some?

I'd add one more thing into the core facility fusion space, or even on its own: lab management systems. LIMS are used in many core facilities to track samples and the data generated from them. Considering the investment in current genomics technologies, there is a real lack of joined-up thinking on LIMS. Most labs use their own in-house developed systems, some buy commercially, and a few try to install academically developed LIMS. I think the NIH could do well to invest a few research $ into LIMS development for common technologies.

Many cores have grown over several years, and this growth may not always have been planned with ruthless efficiency. Cores can also be duplicated in a local region for a multitude of reasons. Maximising efficiency and minimising duplication are obvious ways to save money by having fewer, probably larger, cores. However, many institutes or group leaders will want to have a duplicate facility, and if the funding agencies are not more ruthless in finding and rejecting such duplication then it will continue.

There appears to be a cycle in core facilities similar to the mergers and break-ups seen in the commercial sector. Sometimes small cores located close to research groups are in favour; a few years later, big centralised cores appear to be "de rigueur". Each country and technology appears to be at a different point in the cycle. The recent economic collapse seems to have pushed funding agencies to think about saving money, and core facilities are an option along with everything else.

In my experience, the medium-sized collaborative core, with more than one major stakeholder, is the one that works best. Having several large investors with a real interest in the core succeeding has enabled high-quality investment in equipment and staff. It has allowed us to reinvest as technology moves on. It has also provided a larger pool of users, bringing interesting science into the core and making sure we run as close to actual capacity as possible.