Getting "No loop matching the specified signature and casting was found for ufunc svd_n_s" error

Posted 2020-07-03 07:45

Question:

I'm a beginner in Python and machine learning. I get the error below when I try to fit data with statsmodels.formula.api OLS.fit():

Traceback (most recent call last):
  File "", line 47, in <module>
    regressor_OLS = sm.OLS(y , X_opt).fit()
  File "E:\Anaconda\lib\site-packages\statsmodels\regression\linear_model.py", line 190, in fit
    self.pinv_wexog, singular_values = pinv_extended(self.wexog)
  File "E:\Anaconda\lib\site-packages\statsmodels\tools\tools.py", line 342, in pinv_extended
    u, s, vt = np.linalg.svd(X, 0)
  File "E:\Anaconda\lib\site-packages\numpy\linalg\linalg.py", line 1404, in svd
    u, s, vt = gufunc(a, signature=signature, extobj=extobj)
TypeError: No loop matching the specified signature and casting was found for ufunc svd_n_s

Code

#Importing Libraries
import numpy as np # linear algebra
import pandas as pd # data processing
import matplotlib.pyplot as plt #Visualization


#Importing the dataset
dataset = pd.read_csv('Video_Games_Sales_as_at_22_Dec_2016.csv')
#dataset.head(10) 

#Encoding categorical data using the pandas get_dummies function. Easier and more straightforward than OneHotEncoder in sklearn
#dataset = pd.get_dummies(data = dataset , columns=['Platform' , 'Genre' , 'Rating' ] , drop_first = True ) #drop_first used to avoid the dummy variable trap


dataset=dataset.replace('tbd',np.nan)

#Separating independent & dependent variables
#X = pd.concat([dataset.iloc[:,[11,13]], dataset.iloc[:,13: ]] , axis=1).values  #Getting important variables
X = dataset.iloc[:,[10,12]].values
y = dataset.iloc[:,9].values #Dependent variable (Global sales)


#Taking care of missing data
from sklearn.preprocessing import Imputer
imputer =  Imputer(missing_values = 'NaN' , strategy = 'mean' , axis = 0)
imputer = imputer.fit(X[:,0:2])
X[:,0:2] = imputer.transform(X[:,0:2])


#Splitting the dataset into the Training set and Test set
from sklearn.cross_validation import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X,y,test_size = 0.2 , random_state = 0)

#Fitting Multiple Linear Regression to the Training Set
from sklearn.linear_model import LinearRegression
regressor = LinearRegression()
regressor.fit(X_train,y_train)

#Predicting the Test set Result
y_pred = regressor.predict(X_test)


#Building the optimal model using Backward Elimination (p=0.050)
import statsmodels.formula.api as sm
X = np.append(arr = np.ones((16719,1)).astype(float) , values = X , axis = 1)

X_opt = X[:, [0,1,2]]
regressor_OLS = sm.OLS(y , X_opt).fit()
regressor_OLS.summary() 

Dataset

dataset link

I couldn't find anything helpful for this issue on Stack Overflow or Google.

Answer 1:

Try specifying

dtype = 'float'

when the matrix is created. Example:

a = np.matrix([[1,2],[3,4]], dtype='float')

Hope this works!
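A quick way to confirm this is the cause (a sketch, reusing the X_opt name from the question) is to look at the dtype that actually reaches OLS; slicing the 'tbd'-containing columns out of the DataFrame tends to leave an object array:

import numpy as np

# Inspect the dtype that is handed to OLS; an object dtype is what
# trips the svd_n_s loop lookup.
print(X_opt.dtype)                        # likely dtype('O')
X_opt = np.asarray(X_opt, dtype='float')
print(X_opt.dtype)                        # dtype('float64') after the cast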



Answer 2:

As suggested previously, you need to ensure X_opt is a float type. For example, in your code it would look like this:

X_opt = X[:, [0,1,2]]
X_opt = X_opt.astype(float)
regressor_OLS = sm.OLS(endog=y, exog=X_opt).fit()
regressor_OLS.summary()
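For context, the error can be reproduced with NumPy alone; a minimal sketch, assuming the offending array ended up with object dtype:

import numpy as np

# No SVD loop is registered for object arrays, so this raises the same
# TypeError shown in the question's traceback.
a = np.array([[1, 2], [3, 4]], dtype=object)
np.linalg.svd(a)   # TypeError: No loop matching the specified signature ... ufunc svd_n_s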


Answer 3:

Downgrading from NumPy 1.18.4 to 1.15.2 worked for me: pip install --upgrade numpy==1.15.2
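If you go this route, pinning the version in a requirements file keeps the environment reproducible (a sketch; the file name is just the usual convention):

# requirements.txt
numpy==1.15.2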



Answer 4:

I faced a similar problem. Solved it by specifying the dtype and flattening the array.

numpy version: 1.17.3

a = np.array(a, dtype=np.float)  # cast to float
a = a.flatten()                  # make it one-dimensional
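Applied to the variables from the question, that would look roughly like this (a sketch; it assumes X_opt and y exist as defined in the question):

import numpy as np
import statsmodels.api as sm  # statsmodels.api exposes OLS directly

# Cast the design matrix to float and flatten the target before fitting.
X_opt = np.asarray(X_opt, dtype=float)
y = np.asarray(y, dtype=float).flatten()
regressor_OLS = sm.OLS(y, X_opt).fit()
print(regressor_OLS.summary())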