Friday, November 15, 2013

You gotta go for what you know, Make everybody see, in order to fight the power that be

So I get this question a lot. 

How many replicates do I need for a proteomic profiling experiment? 


http://www.real-statistics.com/wp-content/uploads/2012/11/statistical-power-chart.png

It's a great question and, unfortunately, one I have no idea how to answer with any accuracy.

I remember reading in a book somewhere (I wish I remembered where) that whenever the author was asked this question, he would say: 30 biological replicates! Well, that's great if you can actually generate 30 replicates and then pay for them to be analyzed.... But alas, in the world of proteomics, this is usually unattainable.

So what to do....

Let me start by explaining why I have no idea how to answer this question. It all has to do with power and how it is calculated.

Power is, in essence, your ability to detect a change in your sample when such a change actually exists. Put another way, power = 1 - β, where β is your chance of committing a type II error (a false negative): the higher your power, the lower your chance of missing a real change.
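If that definition feels abstract, here is a minimal Monte Carlo sketch of the idea (all numbers are hypothetical): simulate many experiments in which a real 1.5-fold change exists, run a t-test on each, and count how often the change is actually detected at p < 0.05. That detection rate is your power.

```python
# Minimal sketch of what "power" means, with made-up numbers:
# simulate experiments where a real change exists and count how
# often a plain two-sample t-test catches it.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_reps = 5          # biological replicates per group (hypothetical)
cv = 0.25           # 25% CV within each group (hypothetical)
fold_change = 1.5   # the true change we are trying to detect
n_sims = 10_000

detected = 0
for _ in range(n_sims):
    control = rng.normal(loc=1.0, scale=cv, size=n_reps)
    treated = rng.normal(loc=fold_change, scale=fold_change * cv, size=n_reps)
    _, p = stats.ttest_ind(control, treated)
    if p < 0.05:
        detected += 1

# Power = the fraction of real changes the t-test actually caught
print(f"Empirical power: {detected / n_sims:.2f}")
```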

This is one of the best explanations of power calculations I have seen (it's worth reading):

http://www.refsmmat.com/statistics/power.html

Some bullet points:


  • Power depends on the number of replicates you do, the variation in your sample, and the magnitude of the change you are trying to detect.

So the more replicates you analyze, the better your chances of detecting a low-magnitude change that has a large variation (a quick sketch of this below).
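To put numbers on that, here's a quick sketch using the standard two-sample t-test power calculation in statsmodels. Everything here is hypothetical: the CVs and fold change are made up, and effect size is approximated as (fold change - 1)/CV, which is a rough shortcut rather than a rigorous proteomics model.

```python
# Analytic power for a two-sample t-test as a function of replicates.
# Effect size is approximated as (fold_change - 1) / CV -- a rough
# shortcut; all numbers are hypothetical.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
fold_change = 1.5                          # hypothetical magnitude of change
for cv in (0.05, 0.25, 0.50):              # hypothetical %CVs
    effect_size = (fold_change - 1) / cv
    for n in (3, 5, 10, 15, 30):
        p = analysis.power(effect_size=effect_size, nobs1=n, alpha=0.05)
        print(f"CV={cv:.0%}, n={n}/group -> power={p:.2f}")
```

Running it shows exactly the trade-off above: at 5% CV you hit full power with a handful of replicates, while at 50% CV even 15 per group only gets you partway there.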

Now, this can be calculated in a relatively straightforward way if you, say, have a small number of variables you are interested in and know your variation. I think, anyway....

Now is that what we have in a proteomics or RNA-Seq experiment?

Unfortunately, no.

We have thousands of variables! And to make matters worse, each one of those variables has its own variation and magnitude.

And it gets even more complicated with bottom-up proteomics. Are you defining your variables as peptides or proteins? Ultimately we are interested in the proteins (most of the time), but we need to reconstruct those proteins from the peptides we identify. Each of those peptides may have a different variation (S/N) and magnitude of change across samples. How do we calculate power on a protein level....I have no idea.

For example:

If you have a peptide with a S/N of 1000:1 and a %CV of 5%, you would need fewer replicates to detect a change in that peptide (depending on the magnitude of the change) than for a peptide with a S/N of 2:1 and a %CV of 50% (a rough sketch below). Now let's say both of those peptides map back to the same protein....How do you calculate the power of that? This can actually be quite common, as peptides can vary in their ionization by maybe 4-5 orders of magnitude (I need to look up the citation here, but I think that's a reasonable guess), and their ability to be cut by trypsin can vary by a lot as well (proteotypic vs. non-proteotypic).
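Here's a sketch of that two-peptide comparison, assuming (hypothetically) that both peptides truly change 1.5-fold and using the same (fold change - 1)/CV effect-size shortcut as above; it simply walks n upward until 80% power is reached.

```python
# Replicates per group needed to detect a 1.5-fold change at 80% power,
# for the two hypothetical peptides above. Effect size uses the same
# rough (fold_change - 1) / CV shortcut as before.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
fold_change = 1.5  # assume both peptides truly change 1.5-fold

def reps_needed(cv, target_power=0.8, alpha=0.05):
    """Smallest n per group reaching the target power."""
    effect_size = (fold_change - 1) / cv
    for n in range(2, 1000):
        if analysis.power(effect_size=effect_size, nobs1=n, alpha=alpha) >= target_power:
            return n
    return None

for label, cv in (("S/N 1000:1, 5% CV", 0.05), ("S/N 2:1, 50% CV", 0.50)):
    print(f"{label}: ~{reps_needed(cv)} replicates per group")
```

The clean peptide needs only a couple of replicates; the noisy one lands around 17 per group. Same protein, wildly different answers, which is the whole problem in miniature.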

There have been a few papers on calculating the power of proteomics experiments over the years, this one being one of the better ones.


If you look at the figure from the above paper (you can see a larger version in the paper itself): if you can somehow know that your protein has a 50% variation between groups (how you calculate this, I have no idea...), then at a power of 0.8 (a one-in-five chance of missing the change) you would need 15 biological replicates per group to detect a 1.5-fold change. If you have a 25% variation, you would only need 5 replicates. The good news is that if you have a 100% variation, you may need only 5 replicates to detect a 3-fold change. I think that's probably realistic...

Does the protein you are interested in have a 50% variation? 25%? 5%? I really have no idea. Most likely, among the 1,000+ proteins we can identify, you will have some with a low variation (usually the highly abundant and less interesting ones) and some with a large variation (the less abundant, more interesting ones).

One can only get a handle on the variation after the experiment is run. You can then calculate your %CVs and S/N (1/CV) easily enough (a quick sketch below), but I don't know how to do this before the experiment is run.
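The after-the-fact calculation, at least, is trivial. A sketch with made-up replicate intensities for a single protein:

```python
# Post-hoc %CV and S/N (taken here as 1/CV, per the text) from
# replicate measurements of one protein; intensities are made up.
import numpy as np

intensities = np.array([1.02e6, 9.70e5, 1.10e6, 1.05e6, 9.90e5])
cv = intensities.std(ddof=1) / intensities.mean()
print(f"%CV = {100 * cv:.1f}%   S/N (1/CV) = {1 / cv:.1f}")
```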

So: the more replicates the better, and the less variation the better. But it's still possible that you will need those 30+ replicates to see that < 1 fold change.....



Wednesday, November 13, 2013

Nano-LC isn't easy but it's necessary so I’m chasing peaks like Tom chases Jerry


Man, if I had a quarter for every time my nano-LC system failed. I mean, they're broken more than they work. And how many of them do I have? Five?


Instead of going all Office Space on my nano-LCs, I thought I'd praise the virtues of nano-LC and explain why you need to just bite the bullet and live with it.


Everyone who has done it knows how painful it is, and if it isn't painful, you are either not pushing it to its limits or maybe just injecting clean standards rather than real, dirty biological samples. If nano-LC weren't an issue, you would not have companies touting their Easy-LC (which is anything but easy), their chip-based systems, or their direct-spray systems like Bruker's acquisition (and ultimate destruction) of Michrom's CaptiveSpray system (RIP). I've tried all of them, and they generally are not worth it.



Let's get to the virtues of nano-LC first.


  1. You really get a sense of accomplishment whenever it works.
  2. You can make most of the parts yourself (and you really should).
  3. It really is more sensitive than anything else.
  4. If you don't take labor into account, it's way cheaper.
  5. Job security, as it takes a lot of practice/skill to get really good at it.


Some of the Downsides


  1. Making the parts and troubleshooting can get very expensive if you take labor into account.
  2. It is very easy to overload your columns.
  3. The autosamplers on nano-LC systems generally suck. I still know people who do not use them and bomb-load everything.
  4. When it is working, you should not talk about it or stare at it intently. The nano-LC system will know, and break on you.
  5. Reproducibility can be poor during long runs of samples as the column and spray tip gradually deteriorate.


I'm generally skeptical when people say they run their nano-spray system for months nonstop on real biological samples. I suppose this is possible if you are not too concerned with RT and sensitivity reproducibility. But if you, say, run standards every 5-10 runs, you'll see how fast things can go downhill. Remember: every time you inject a sample, you are changing your column. That carryover you see is there because you are not recovering 100% of your sample from your LC system, and that carryover changes your system in subtle and not-so-subtle ways.


Here are some tips I have come up with over the years


  • Making your own columns and traps and packing them is not that hard. I'll put up a tutorial someday. If you buy a nano-column from a company, they are most likely using a Sutter laser puller and packing with a pressure vessel, just like you can do.
  • Most people use a vented column setup with a tee or a cross (for the HV).
    • Watch out for those clogging; they are a major point of failure.
  • Main points of failure on your LC system (depending on how it works):
    • Rotors (rotor seals, mainly)
    • Piston seals
    • Check valves
    • Tubing getting clogged
      • Which is why I do not recommend PEEKsil
    • Fittings not connected correctly and introducing dead volume

Just remember, it gets easier if you are just getting started. And don't be tempted by those all-in-one nano solutions you see for sale. They will ultimately fail in some way, and you will have a hard time fixing the problem yourself if you do not know how the system works. When you have samples piling up, it's far better if you can fix it yourself. Of course, there will always be real hardware failures you can't fix, but most of the routine problems you can.

Thursday, September 19, 2013

Scientific Core Facilities

I could not find another blog about core facilities, so I thought I might as well make one....Why not, right?

So where to start....

I direct a scientific core facility (a proteomics core, specifically) and have been for what seems like a long time....Well, more than a decade now... I've actually been associated with core facilities in one form or another since 1995 or so....

It's a rather unique profession, and certainly a non-traditional one when compared to the traditional

post-doc --> assistant professor --> associate professor --> professor track

that most Ph.D. scientists take, or at least think they might take, when they start grad school (at least they did in 1995...maybe not so much anymore).

So how did I end up here? Well, like most people who work in core facilities, it was part accident and part because I found early in my career that I really loved being in a core facility.

It started sort of like this. In grad school I was put on a project to study two proteins that sit in the Sindbis virus membrane.

Pretty cool, huh?

Anyway, I had no idea how to do that or what to do... My PI mentioned I should talk to the people in the core facility across campus that had this new thing called a MALDI mass spectrometer (this was circa 1995, BTW).

So I walk into the core and see this giant beast of a machine with a huge laser and a joystick, and I say to myself: wow, this is the coolest thing I've ever seen! And I can learn how to use it? Really?

The Vestec is so old now I can't even find a picture of it anymore. It was actually a precursor to the Voyager, but not nearly as refined or even functional. Plus, it didn't have those white panels on the machine, so you could see the flight tube and all the electronics. Mass specs look so much cooler without those stupid panels...but I digress.

The samples had to be put into this beast on small pins, and you could not so much aim the laser as rotate the pins to get a good crystal under the laser beam. The data actually had to be recorded in part on an oscilloscope, although it did have a Windows data system of some sort. The details are a bit fuzzy...it was a long time ago.... At the time it was truly revolutionary (at least to me), and I loved every second of using it...and still do.

See, the Vestec was in a core facility because these machines are usually very expensive, and only a few labs can really afford their own.

Think of a core facility as a shared resource where other scientists can use the expertise in the core and have access to the equipment without having to buy it themselves ($500k-$1M+), or train their own grad student or post-doc in the fine arts of using such a machine, or in planning the proper experiment to take advantage of it.

So that's initially how I started down the long track of doing science in a core facility. Next time I'll talk about the ABRF and how they helped me get through grad school.