[ML] House Price Prediction (EDA + Keras)

Author : tmlab / Date : 2017. 12. 29. 19:14 / Category : Analytics

In [3]:
import os 
import pandas as pd
os.chdir(r"D:\STUDY\kaggle")  # raw string avoids backslash-escape issues

train = pd.read_csv('train.csv')
test = pd.read_csv('test.csv')
In [7]:
train.head()
Out[7]:
(first 5 rows of train — the 81 columns, Id through SalePrice, are too wide to render here)

5 rows × 81 columns

In [8]:
test.head()
Out[8]:
(first 5 rows of test — 80 columns, Id through SaleCondition; no SalePrice column — too wide to render here)

5 rows × 80 columns

In [23]:
train.describe()
Out[23]:
(count, mean, std, min, 25%, 50%, 75%, max for each of the 38 numeric columns — too wide to render here)

8 rows × 38 columns

EDA

Checking for missing values

  • MiscFeature, Fence, and PoolQC are missing in roughly 96%, 81%, and 99.5% of rows, respectively
In [229]:
def cnt_NA(df):
    # print the NA count and ratio for every column that has missing values
    for col in df.columns:
        na = df[col].isnull().sum()
        if na != 0:
            print(col + ":" + str(na) + ", NA_ratio:" + str(na / len(df)))
    print("NA test end")
In [230]:
cnt_NA(train)
LotFrontage:259, NA_ratio:0.177397260274
Alley:1369, NA_ratio:0.937671232877
MasVnrType:8, NA_ratio:0.00547945205479
MasVnrArea:8, NA_ratio:0.00547945205479
BsmtQual:37, NA_ratio:0.0253424657534
BsmtCond:37, NA_ratio:0.0253424657534
BsmtExposure:38, NA_ratio:0.0260273972603
BsmtFinType1:37, NA_ratio:0.0253424657534
BsmtFinType2:38, NA_ratio:0.0260273972603
Electrical:1, NA_ratio:0.000684931506849
FireplaceQu:690, NA_ratio:0.472602739726
GarageType:81, NA_ratio:0.0554794520548
GarageYrBlt:81, NA_ratio:0.0554794520548
GarageFinish:81, NA_ratio:0.0554794520548
GarageQual:81, NA_ratio:0.0554794520548
GarageCond:81, NA_ratio:0.0554794520548
PoolQC:1453, NA_ratio:0.995205479452
Fence:1179, NA_ratio:0.807534246575
MiscFeature:1406, NA_ratio:0.96301369863
NA test end
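The same per-column summary can also be written as a pandas one-liner; a minimal sketch (the helper name `na_ratio` and the toy frame are illustrative, not from the notebook):

```python
import pandas as pd

def na_ratio(df):
    """NA count and ratio per column, for columns that have any NAs."""
    na = df.isnull().sum()
    out = pd.DataFrame({"count": na, "ratio": na / len(df)})
    return out[out["count"] > 0].sort_values("ratio", ascending=False)

# toy example (not the Kaggle data)
df = pd.DataFrame({"a": [1, None, 3, None], "b": [1, 2, 3, 4]})
print(na_ratio(df))
```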

Distribution of the target variable

In [42]:
train["SalePrice"].describe()
Out[42]:
count      1460.000000
mean     180921.195890
std       79442.502883
min       34900.000000
25%      129975.000000
50%      163000.000000
75%      214000.000000
max      755000.000000
Name: SalePrice, dtype: float64
In [52]:
%matplotlib inline
import seaborn as sns
sns.distplot(train['SalePrice'])
Out[52]:
<matplotlib.axes._subplots.AxesSubplot at 0x1613d044400>
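The plot shows a right-skewed distribution, which can be quantified directly with `Series.skew()`; a sketch on toy data (not the actual SalePrice series):

```python
import pandas as pd

# right-skewed toy data: a few large values pull the mean above the median
s = pd.Series([100, 120, 130, 140, 150, 160, 400, 750])
print(s.skew() > 0, s.mean() > s.median())  # True True
```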

Predictors: separating numeric and categorical variables

In [118]:
cols = train.columns  # all columns
num_cols = train._get_numeric_data().columns  # numeric columns
num_cols = list(num_cols)
In [119]:
cate_cols = list(set(cols) - set(num_cols))  # categorical columns
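Note that `_get_numeric_data` is a private pandas API; the public `select_dtypes` does the same split. A sketch on a toy frame:

```python
import pandas as pd

df = pd.DataFrame({"LotArea": [8450, 9600], "MSZoning": ["RL", "RM"]})
num_cols = df.select_dtypes(include="number").columns.tolist()
cate_cols = df.select_dtypes(exclude="number").columns.tolist()
print(num_cols, cate_cols)  # ['LotArea'] ['MSZoning']
```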

Distribution of predictors: numeric variables

In [84]:
numeric_data = train._get_numeric_data()
numeric_data.hist(bins=50, figsize=(20,15))
Out[84]:
(7×6 array of AxesSubplot objects — histogram grid rendered inline, repr omitted)

Distribution of predictors: categorical variables

In [129]:
cate_data = train[cate_cols]
cate_data.head()
Out[129]:
(first 5 rows of the 43 categorical columns — too wide to render here)

5 rows × 43 columns

In [130]:
cate_data.describe()
Out[130]:
(count, unique, top, freq for each of the 43 categorical columns — too wide to render here)

4 rows × 43 columns

In [181]:
# 43 categorical columns
cate_data[cate_cols[0]].value_counts().plot(kind = "bar")
Out[181]:
<matplotlib.axes._subplots.AxesSubplot at 0x1616c3a1320>

Correlation between predictors and the target

In [153]:
numeric_data.plot.scatter(x=num_cols[4], y='SalePrice', ylim=(0,800000));
In [182]:
from pandas.plotting import scatter_matrix  # pandas.tools.plotting is deprecated
scatter_matrix(numeric_data, figsize=(12,8))
Out[182]:
(array of AxesSubplot objects — scatter matrix rendered inline, repr omitted)
In [183]:
corr_matrix = numeric_data.corr()
corr_matrix["SalePrice"].sort_values(ascending=False)
Out[183]:
SalePrice        1.000000
OverallQual      0.790982
GrLivArea        0.708624
GarageCars       0.640409
GarageArea       0.623431
TotalBsmtSF      0.613581
1stFlrSF         0.605852
FullBath         0.560664
TotRmsAbvGrd     0.533723
YearBuilt        0.522897
YearRemodAdd     0.507101
GarageYrBlt      0.486362
MasVnrArea       0.477493
Fireplaces       0.466929
BsmtFinSF1       0.386420
LotFrontage      0.351799
WoodDeckSF       0.324413
2ndFlrSF         0.319334
OpenPorchSF      0.315856
HalfBath         0.284108
LotArea          0.263843
BsmtFullBath     0.227122
BsmtUnfSF        0.214479
BedroomAbvGr     0.168213
ScreenPorch      0.111447
PoolArea         0.092404
MoSold           0.046432
3SsnPorch        0.044584
BsmtFinSF2      -0.011378
BsmtHalfBath    -0.016844
MiscVal         -0.021190
Id              -0.021917
LowQualFinSF    -0.025606
YrSold          -0.028923
OverallCond     -0.077856
MSSubClass      -0.084284
EnclosedPorch   -0.128578
KitchenAbvGr    -0.135907
Name: SalePrice, dtype: float64
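A common next step is to keep only features whose absolute correlation with the target passes a threshold; a minimal sketch on toy data (the 0.5 cutoff and the toy values are illustrative, not from the notebook):

```python
import pandas as pd

df = pd.DataFrame({
    "OverallQual": [5, 6, 7, 8, 9],
    "MoSold":      [2, 11, 5, 7, 3],
    "SalePrice":   [120, 150, 185, 220, 260],
})
# correlation of every predictor with the target
corr = df.corr()["SalePrice"].drop("SalePrice")
selected = corr[corr.abs() > 0.5].index.tolist()
print(selected)  # ['OverallQual']
```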
In [297]:
import matplotlib.pyplot as plt
import seaborn as sns
plt.subplots(figsize=(12,9))
sns.heatmap(corr_matrix, vmax=0.9, square=True)
Out[297]:
<matplotlib.axes._subplots.AxesSubplot at 0x1612d67ae48>

Throwing deep learning at the numeric variables only

  • No outlier removal, no derived features — just fit the model as-is
In [440]:
num_data = train._get_numeric_data()
cnt_NA(num_data)
LotFrontage:259, NA_ratio:0.177397260274
MasVnrArea:8, NA_ratio:0.00547945205479
GarageYrBlt:81, NA_ratio:0.0554794520548
NA test end
In [441]:
# only a few missing values — just drop those rows
num_data = num_data.dropna(subset=["MasVnrArea"])
num_data = num_data.dropna(subset=["GarageYrBlt"])
len(num_data)
Out[441]:
1371
In [442]:
# fill LotFrontage with the median
median = num_data["LotFrontage"].median()
num_data["LotFrontage"] = num_data["LotFrontage"].fillna(median)
In [443]:
cnt_NA(num_data)
NA test end
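Dropping rows discards data; an alternative (a sketch of median imputation, not what this notebook does) is scikit-learn's `SimpleImputer`. Note `sklearn.impute` assumes scikit-learn ≥ 0.20; older versions used `sklearn.preprocessing.Imputer`:

```python
import numpy as np
from sklearn.impute import SimpleImputer

# toy matrix: column 0 has one missing value among [60, 80]
X = np.array([[60.0, 8450.0],
              [np.nan, 9600.0],
              [80.0, 11250.0]])
imp = SimpleImputer(strategy="median")
X_filled = imp.fit_transform(X)
print(X_filled[1, 0])  # median of [60, 80] -> 70.0
```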
In [444]:
num_data.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 1371 entries, 0 to 1459
Data columns (total 38 columns):
Id               1371 non-null int64
MSSubClass       1371 non-null int64
LotFrontage      1371 non-null float64
LotArea          1371 non-null int64
OverallQual      1371 non-null int64
OverallCond      1371 non-null int64
YearBuilt        1371 non-null int64
YearRemodAdd     1371 non-null int64
MasVnrArea       1371 non-null float64
BsmtFinSF1       1371 non-null int64
BsmtFinSF2       1371 non-null int64
BsmtUnfSF        1371 non-null int64
TotalBsmtSF      1371 non-null int64
1stFlrSF         1371 non-null int64
2ndFlrSF         1371 non-null int64
LowQualFinSF     1371 non-null int64
GrLivArea        1371 non-null int64
BsmtFullBath     1371 non-null int64
BsmtHalfBath     1371 non-null int64
FullBath         1371 non-null int64
HalfBath         1371 non-null int64
BedroomAbvGr     1371 non-null int64
KitchenAbvGr     1371 non-null int64
TotRmsAbvGrd     1371 non-null int64
Fireplaces       1371 non-null int64
GarageYrBlt      1371 non-null float64
GarageCars       1371 non-null int64
GarageArea       1371 non-null int64
WoodDeckSF       1371 non-null int64
OpenPorchSF      1371 non-null int64
EnclosedPorch    1371 non-null int64
3SsnPorch        1371 non-null int64
ScreenPorch      1371 non-null int64
PoolArea         1371 non-null int64
MiscVal          1371 non-null int64
MoSold           1371 non-null int64
YrSold           1371 non-null int64
SalePrice        1371 non-null int64
dtypes: float64(3), int64(35)
memory usage: 417.7 KB
In [461]:
from sklearn.model_selection import train_test_split
X = num_data.iloc[:, :-1].values
y = num_data.iloc[:, -1].values

X_train, X_test, Y_train, Y_test = train_test_split(X, y, test_size=0.3, random_state=1)
print(len(X_train), len(X_test), len(Y_train), len(Y_test))
959 412 959 412
In [462]:
from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)
In [463]:
from keras.models import Sequential
from keras.layers import Dense

model = Sequential([
    Dense(80, input_dim=37, kernel_initializer='normal', activation='selu'),
    Dense(40, kernel_initializer='normal', activation='selu'),
    Dense(1, kernel_initializer='normal'),
])
In [464]:
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_64 (Dense)             (None, 80)                3040      
_________________________________________________________________
dense_65 (Dense)             (None, 40)                3240      
_________________________________________________________________
dense_66 (Dense)             (None, 1)                 41        
=================================================================
Total params: 6,321
Trainable params: 6,321
Non-trainable params: 0
_________________________________________________________________
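The parameter counts in the summary follow from inputs × units + units (one bias per unit); a quick arithmetic check:

```python
# Dense layer params = inputs * units + units (bias)
p1 = 37 * 80 + 80   # 3040
p2 = 80 * 40 + 40   # 3240
p3 = 40 * 1 + 1     # 41
print(p1, p2, p3, p1 + p2 + p3)  # 3040 3240 41 6321
```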
In [465]:
model.compile(loss='mse', optimizer='adam')
In [466]:
history = model.fit(X_train, Y_train, batch_size=3, epochs=100)
Epoch 1/100
959/959 [==============================] - 1s - loss: 40000984577.6017
Epoch 2/100
959/959 [==============================] - 0s - loss: 39431871233.8686
Epoch 3/100
959/959 [==============================] - 0s - loss: 37741441195.9124
...
(epochs 4–97 omitted — the loss falls steadily from ~3.5e10 to ~8.0e8)
...
Epoch 98/100
959/959 [==============================] - 1s - loss: 785197628.4849
Epoch 99/100
959/959 [==============================] - 1s - loss: 777770800.6911
Epoch 100/100
959/959 [==============================] - 1s - loss: 770006605.0193
In [470]:
import matplotlib.pyplot as plt
plt.plot(history.history["loss"])
plt.title("Loss")
plt.show()
In [471]:
score = model.evaluate(X_test, Y_test, verbose=0)
print(model.metrics_names)
print(score)
['loss']
750662322.796
In [472]:
predictions = model.predict(X_test)

from sklearn.metrics import mean_absolute_error
print("Mean Absolute Error : " + str(mean_absolute_error(predictions, Y_test)))
Mean Absolute Error : 19267.1467688
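The MSE values above are enormous because SalePrice is in the hundreds of thousands. A common remedy (a sketch, not part of this notebook) is to train on `log1p` of the target and invert predictions with `expm1`, which also matches the competition's log-RMSE evaluation metric:

```python
import numpy as np

y = np.array([208500.0, 181500.0, 223500.0])
y_log = np.log1p(y)       # train the model on this instead of raw prices
y_back = np.expm1(y_log)  # invert predictions back to dollars
print(np.allclose(y_back, y))  # True
```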

