
Is there any way to increase memory assigned to jupyter notebook?

Posted 2020-06-25 05:27

Question:

I am using Python 3.6.

My Jupyter notebook keeps crashing when I try to run NUTS sampling in PyMC3.

My laptop has 16 GB of RAM and an i7, which I think should be enough. I ran the same code on an 8 GB i7 laptop and it worked that time. I am not able to figure out what the issue is on this one.

I have generated the config file for Jupyter with this command:

$ jupyter notebook --generate-config

I am not able to figure out which parameter I need to modify to tackle this issue.
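
For reference, this is the kind of edit I was considering in ~/.jupyter/jupyter_notebook_config.py. I am only guessing that max_buffer_size is the relevant option, so please correct me if it is the wrong one:

# ~/.jupyter/jupyter_notebook_config.py (generated by the command above)
# Guessing at the memory-related option; the generated file lists the
# available settings commented out, so the name may differ by version.
c.NotebookApp.max_buffer_size = 4 * 1024 ** 3  # buffer memory in bytes (about 4 GB)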

This is the code I am using:

import pymc3 as pm
import theano.tensor as tt

with pm.Model() as model:
    # hyperpriors
    home = pm.Flat('home')  # flat pdf is uninformative - means we have no idea
    sd_att = pm.HalfStudentT('sd_att', nu=3, sd=2.5)
    sd_def = pm.HalfStudentT('sd_def', nu=3, sd=2.5)
    intercept = pm.Flat('intercept')

    # team-specific model parameters
    atts_star = pm.Normal("atts_star", mu=0, sd=sd_att, shape=num_teams)
    defs_star = pm.Normal("defs_star", mu=0, sd=sd_def, shape=num_teams)

    # To allow samples of expressions to be saved, we need to wrap them in
    # pymc3 Deterministic objects
    atts = pm.Deterministic('atts', atts_star - tt.mean(atts_star))
    defs = pm.Deterministic('defs', defs_star - tt.mean(defs_star))

    # Assume exponential search on home_theta and away_theta. With pymc3,
    # need to rely on theano. tt is theano.tensor.. why Sampyl may be easier to use..
    home_theta = tt.exp(intercept + home + atts[home_team] + defs[away_team])
    away_theta = tt.exp(intercept + atts[away_team] + defs[home_team])

    # likelihood of observed data
    home_points = pm.Poisson('home_points', mu=home_theta,
                             observed=observed_home_goals)
    away_points = pm.Poisson('away_points', mu=away_theta,
                             observed=observed_away_goals)
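
The crash happens when I run the sampling step inside the same model context. A minimal version of that call is below; the draw, chain and core counts are placeholders rather than my exact settings:

with model:
    # NUTS is PyMC3's default sampler for continuous variables.
    # Fewer chains/cores and fewer draws reduce peak memory use;
    # the numbers below are placeholders.
    trace = pm.sample(draws=1000, tune=1000, chains=2, cores=1)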

I have also attached a screenshot of the error.

Answer 1:

Yes, you can use the following command after activating your environment:

jupyter notebook --NotebookApp.iopub_data_rate_limit=1e10

If you need a higher or lower limit, change 1e10. By default it is 1e6.
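
If you want this to be permanent instead of passing the flag every time, the same setting can go into the config file generated earlier:

# ~/.jupyter/jupyter_notebook_config.py
c.NotebookApp.iopub_data_rate_limit = 1e10  # bytes per second; the default is 1e6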



Answer 2:

Actually, this is not a memory issue.

Jupyter can hit this error for many reasons, including browser issues: it is raised whenever the notebook runs in Safari, and the same issue appears in Google Chrome when Chrome is not the default browser.

Jupyter currently does not work with tornado server version 6.0.1; use another version of tornado.
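
You can check which tornado version you have from inside the notebook, and if it is 6.0.1, pin an older release; 5.1.1 is the one commonly suggested for this:

import tornado
print(tornado.version)  # if this prints 6.0.1, switch to an older release
# for example, from a terminal in the same environment:
#   pip install "tornado==5.1.1"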