I am having some trouble masking a Panel in the same way that I would a DataFrame. What I want to do feels simple, but I have not found a way to do it in the docs or online forums. A simple example is below:
import pandas
import numpy as np
import datetime
start_date = datetime.datetime(2009,3,1,6,29,59)
r = pandas.date_range(start_date, periods=12)
cols_1 = ['AAPL', 'AAPL', 'GOOG', 'GOOG', 'GS', 'GS']
cols_2 = ['close', 'rate', 'close', 'rate', 'close', 'rate']
dat = np.random.randn(12, 6)
cols = pandas.MultiIndex.from_arrays([cols_1, cols_2], names=['ticker', 'field'])
dftst = pandas.DataFrame(dat, columns=cols, index=r)
pn = dftst.T.to_panel().transpose(2, 0, 1)
print(pn)
Out[14]:
<class 'pandas.core.panel.Panel'>
Dimensions: 2 (items) x 12 (major_axis) x 3 (minor_axis)
Items axis: close to rate
Major_axis axis: 2009-03-01 06:29:59 to 2009-03-12 06:29:59
Minor_axis axis: AAPL to GS
I now have a Panel object. If I take a slice along the items axis, I get a DataFrame:
close_p = pn['close']
print(close_p)
Out[16]:
ticker AAPL GOOG GS
2009-03-01 06:29:59 -0.082203 -0.286354 1.227193
2009-03-02 06:29:59 0.340005 -0.688933 -1.505137
2009-03-03 06:29:59 -0.525567 0.321858 -0.035047
2009-03-04 06:29:59 -0.123549 -0.841781 -0.616523
2009-03-05 06:29:59 -0.407504 0.188372 1.311262
2009-03-06 06:29:59 0.272883 0.817179 0.584664
2009-03-07 06:29:59 -1.767227 1.168876 0.443096
2009-03-08 06:29:59 -0.685501 -0.534373 -0.063906
2009-03-09 06:29:59 0.851820 0.068740 0.566537
2009-03-10 06:29:59 0.390678 -0.012422 -0.152375
2009-03-11 06:29:59 -0.985585 -0.917705 -0.585091
2009-03-12 06:29:59 0.067498 -0.764343 0.497270
I can filter this data in two ways:
1) I can build a boolean mask and apply it with where, which keeps values where the mask is True and sets everything else to NaN (mask is its inverse, replacing values where the condition is True):
msk = close_p > 0
close_p = close_p.where(msk)
2) I can index directly with the boolean condition, which gives the same result (see the quick check after the output below):
close_p = close_p[close_p > 0]
Out[28]:
ticker AAPL GOOG GS
2009-03-01 06:29:59 NaN NaN 1.227193
2009-03-02 06:29:59 0.340005 NaN NaN
2009-03-03 06:29:59 NaN 0.321858 NaN
2009-03-04 06:29:59 NaN NaN NaN
2009-03-05 06:29:59 NaN 0.188372 1.311262
2009-03-06 06:29:59 0.272883 0.817179 0.584664
2009-03-07 06:29:59 NaN 1.168876 0.443096
2009-03-08 06:29:59 NaN NaN NaN
2009-03-09 06:29:59 0.851820 0.068740 0.566537
2009-03-10 06:29:59 0.390678 NaN NaN
2009-03-11 06:29:59 NaN NaN NaN
2009-03-12 06:29:59 0.067498 NaN 0.497270
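Just to convince myself the two approaches agree, here is a quick NaN-aware check, as a sketch continuing the session above (it assumes pn has not been modified yet and that the pandas version has DataFrame.equals):
# Hypothetical sanity check: where() and boolean indexing should agree
# cell-for-cell, including the positions of the NaNs.
a = pn['close'].where(pn['close'] > 0)
b = pn['close'][pn['close'] > 0]
assert a.equals(b)  # equals() treats NaNs in the same location as equal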
What I cannot figure out is how to filter all of my data based on a single mask without a for loop. I can do the following:
msk = (pn['rate'] > 0) & (pn['close'] > 0)
def mask_panel(pan, msk):
    for item in pan.items:
        pan[item] = pan[item].mask(msk)
    return pan
print(pn['close'])
Out[32]:
ticker AAPL GOOG GS
2009-03-01 06:29:59 -0.082203 -0.286354 1.227193
2009-03-02 06:29:59 0.340005 -0.688933 -1.505137
2009-03-03 06:29:59 -0.525567 0.321858 -0.035047
2009-03-04 06:29:59 -0.123549 -0.841781 -0.616523
2009-03-05 06:29:59 -0.407504 0.188372 1.311262
2009-03-06 06:29:59 0.272883 0.817179 0.584664
2009-03-07 06:29:59 -1.767227 1.168876 0.443096
2009-03-08 06:29:59 -0.685501 -0.534373 -0.063906
2009-03-09 06:29:59 0.851820 0.068740 0.566537
2009-03-10 06:29:59 0.390678 -0.012422 -0.152375
2009-03-11 06:29:59 -0.985585 -0.917705 -0.585091
2009-03-12 06:29:59 0.067498 -0.764343 0.497270
mask_panel(pn, msk)
print(pn['close'])
Out[34]:
ticker AAPL GOOG GS
2009-03-01 06:29:59 -0.082203 -0.286354 NaN
2009-03-02 06:29:59 NaN -0.688933 -1.505137
2009-03-03 06:29:59 -0.525567 NaN -0.035047
2009-03-04 06:29:59 -0.123549 -0.841781 -0.616523
2009-03-05 06:29:59 -0.407504 NaN NaN
2009-03-06 06:29:59 NaN NaN NaN
2009-03-07 06:29:59 -1.767227 NaN NaN
2009-03-08 06:29:59 -0.685501 -0.534373 -0.063906
2009-03-09 06:29:59 NaN NaN NaN
2009-03-10 06:29:59 NaN -0.012422 -0.152375
2009-03-11 06:29:59 -0.985585 -0.917705 -0.585091
2009-03-12 06:29:59 NaN -0.764343 NaN
So the above loop does the trick. I know there is a faster, vectorized way of doing this using the underlying ndarray, but I have not put that together yet (see the sketch below). It also seems like this should be functionality that is built into the pandas library. If there is a way to do this that I am missing, any suggestions would be much appreciated.
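For reference, here is the kind of vectorized version I have in mind, as a sketch under the same setup (it assumes pn.values returns the underlying 3-D array in (items, major_axis, minor_axis) order, matching the Panel's axes):
# Sketch of a vectorized mask using the underlying ndarray: broadcast the
# 2-D mask over (major_axis, minor_axis) across every item at once,
# instead of looping over pn.items in Python.
vals = pn.values.copy()
vals[:, msk.values] = np.nan  # NaN out the masked cells in all items
pn_masked = pandas.Panel(vals, items=pn.items,
                         major_axis=pn.major_axis,
                         minor_axis=pn.minor_axis)
This avoids the Python-level loop entirely, though it round-trips through a plain ndarray and rebuilds the Panel, so I would still prefer a built-in pandas method if one exists.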