Following the advice from my previous post, I rewrote my time-series analysis script using the Keras library, but got the output below from the model.
In a recurrent network, the input shape should be something like (batch size, time steps, input features).
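As a minimal illustration of that layout (a sketch, not code from the original post): a 2-D feature matrix can be given the extra time-step axis with NumPy, the same way the `reshape_dataset` function in the script does. The shapes here are hypothetical placeholders:

```python
import numpy as np

# 5 samples with 58 features each, treated as sequences of length 1
features = np.zeros((5, 58))

# Insert a time-step axis: (batch, timesteps, features)
X = np.reshape(features, (features.shape[0], 1, features.shape[1]))
print(X.shape)  # (5, 1, 58)
```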
Output
Traceback (most recent call last):
  File "rnrs.py", line 114, in <module>
    model = train_model(get_model(), X_train, Y_train, (X_dev, Y_dev), [plot_losses])
  File "rnrs.py", line 111, in train_model
    model.fit(X_train, Y_train, epochs=200, batch_size=1024, validation_data=validation, callbacks=callbacks, shuffle=False)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\keras\engine\training.py", line 1213, in fit
    self._make_train_function()
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\keras\engine\training.py", line 316, in _make_train_function
    loss=self.total_loss)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\keras\legacy\interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\keras\backend\tensorflow_backend.py", line 75, in symbolic_fn_wrapper
    return func(*args, **kwargs)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\keras\optimizers.py", line 543, in get_updates
    p_t = p - lr_t * m_t / (K.sqrt(v_t) + self.epsilon)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow_core\python\ops\math_ops.py", line 903, in binary_op_wrapper
    y, dtype_hint=x.dtype.base_dtype, name="y")
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow_core\python\framework\ops.py", line 1242, in convert_to_tensor_v2
    as_ref=False)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow_core\python\framework\ops.py", line 1296, in internal_convert_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow_core\python\framework\constant_op.py", line 286, in _constant_tensor_conversion_function
    return constant(v, dtype=dtype, name=name)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow_core\python\framework\constant_op.py", line 227, in constant
    allow_broadcast=True)
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow_core\python\framework\constant_op.py", line 265, in _constant_impl
    allow_broadcast=allow_broadcast))
  File "C:\Users\luis\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow_core\python\framework\tensor_util.py", line 437, in make_tensor_proto
    raise ValueError("None values not supported.")
ValueError: None values not supported.
Script
import pandas as pd

def load_dataset():
    ds = pd.read_csv('hour.csv')
    ds['dteday'] = pd.to_datetime(ds['dteday'])
    return ds

def one_hot_encoding(df, field):
    one_hot_encoded = pd.get_dummies(df[field])
    return pd.concat([df.drop(field, axis=1), one_hot_encoded], axis=1)

def preprocess_dataset(df):
    df_reduced = df[['dteday', 'cnt', 'season', 'yr', 'mnth', 'hr', 'holiday', 'weekday', 'workingday', 'weathersit', 'temp', 'atemp', 'hum', 'windspeed']]
    df_reduced = one_hot_encoding(df_reduced, 'season')
    df_reduced = one_hot_encoding(df_reduced, 'mnth')
    df_reduced = one_hot_encoding(df_reduced, 'hr')
    df_reduced = one_hot_encoding(df_reduced, 'weekday')
    df_reduced = one_hot_encoding(df_reduced, 'weathersit')
    return df_reduced

dataset = load_dataset()
dataset = preprocess_dataset(dataset)

from datetime import datetime

def filter_by_date(ds, start_date, end_date):
    start_date_parsed = datetime.strptime(start_date, "%Y-%m-%d")
    start_end_parsed = datetime.strptime(end_date, "%Y-%m-%d")
    return ds[(ds['dteday'] >= start_date_parsed) & (ds['dteday'] <= start_end_parsed)]

train = filter_by_date(dataset, '2011-01-01', '2012-10-31')
dev = filter_by_date(dataset, '2012-11-01', '2012-11-30')
val = filter_by_date(dataset, '2012-12-01', '2012-12-31')

import numpy as np

def reshape_dataset(ds):
    Y = ds['cnt'].values
    ds_values = ds.drop(['dteday', 'cnt'], axis=1).values
    X = np.reshape(ds_values, (ds_values.shape[0], 1, ds_values.shape[1]))
    return X, Y

X_train, Y_train = reshape_dataset(train)
X_dev, Y_dev = reshape_dataset(dev)
X_val, Y_val = reshape_dataset(val)

import keras
from matplotlib import pyplot as plt
from IPython.display import clear_output

class PlotLosses(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.i = 0
        self.x = []
        self.losses = []
        self.val_losses = []
        self.fig = plt.figure()
        self.logs = []

    def on_epoch_end(self, epoch, logs={}):
        self.logs.append(logs)
        self.x.append(self.i)
        self.losses.append(logs.get('loss'))
        self.val_losses.append(logs.get('val_loss'))
        self.i += 1
        clear_output(wait=True)
        plt.plot(self.x, self.losses, label="loss")
        plt.plot(self.x, self.val_losses, label="val_loss")
        plt.legend()
        plt.show()

plot_losses = PlotLosses()

from keras.models import Model
from keras.layers import Input, Dense, LSTM, Dropout

def get_model():
    input = Input(shape=(1, 58))
    x = LSTM(200)(input)
    x = Dropout(.5)(x)
    activation = Dense(1, activation='linear')(x)
    model = Model(inputs=input, outputs=activation)
    optimizer = keras.optimizers.Adam(lr=0.01,
                                      beta_1=0.9,
                                      beta_2=0.999,
                                      epsilon=None,
                                      decay=0.001,
                                      amsgrad=False)
    model.compile(loss='mean_absolute_error', optimizer=optimizer)
    model.summary()
    return model

get_model()

def train_model(model, X_train, Y_train, validation, callbacks):
    model.fit(X_train, Y_train, epochs=200, batch_size=1024, validation_data=validation, callbacks=callbacks, shuffle=False)
    return model

model = train_model(get_model(), X_train, Y_train, (X_dev, Y_dev), [plot_losses])
Dataset: Bike sharing dataset
Expected output
Best Answer
I made slight modifications to your script in Google Colab, loading the zip directly from the web and processing it there (code included below), and I got no errors. I am not entirely sure what is different, but this version may be useful - perhaps the input data for the fitting process was not being read correctly from the local csv. I hope this helps:
# Source for download_extract_zip:
# https://techoverflow.net/2018/01/16/downloading-reading-a-zip-file-in-memory-using-python/
import requests
import io
import zipfile

def download_extract_zip(url):
    """
    Download a ZIP file and extract its contents in memory
    yields (filename, file-like object) pairs
    """
    response = requests.get(url)
    with zipfile.ZipFile(io.BytesIO(response.content)) as thezip:
        for zipinfo in thezip.infolist():
            with thezip.open(zipinfo) as thefile:
                yield zipinfo.filename, thefile

import pandas as pd

def load_dataset():
    ds = ''
    raw_dataset = 'https://archive.ics.uci.edu/ml/machine-learning-databases/00275/Bike-Sharing-Dataset.zip'
    for (iFilename, iFile) in download_extract_zip(raw_dataset):
        if iFilename == 'hour.csv':
            ds = pd.read_csv(iFile)
            ds['dteday'] = pd.to_datetime(ds['dteday'])
    return ds

def one_hot_encoding(df, field):
    one_hot_encoded = pd.get_dummies(df[field])
    return pd.concat([df.drop(field, axis=1), one_hot_encoded], axis=1)

def preprocess_dataset(df):
    df_reduced = df[['dteday', 'cnt', 'season', 'yr', 'mnth', 'hr', 'holiday', 'weekday', 'workingday', 'weathersit', 'temp', 'atemp', 'hum', 'windspeed']]
    df_reduced = one_hot_encoding(df_reduced, 'season')
    df_reduced = one_hot_encoding(df_reduced, 'mnth')
    df_reduced = one_hot_encoding(df_reduced, 'hr')
    df_reduced = one_hot_encoding(df_reduced, 'weekday')
    df_reduced = one_hot_encoding(df_reduced, 'weathersit')
    return df_reduced

dataset = load_dataset()
dataset = preprocess_dataset(dataset)

from datetime import datetime

def filter_by_date(ds, start_date, end_date):
    start_date_parsed = datetime.strptime(start_date, "%Y-%m-%d")
    start_end_parsed = datetime.strptime(end_date, "%Y-%m-%d")
    return ds[(ds['dteday'] >= start_date_parsed) & (ds['dteday'] <= start_end_parsed)]

train = filter_by_date(dataset, '2011-01-01', '2012-10-31')
dev = filter_by_date(dataset, '2012-11-01', '2012-11-30')
val = filter_by_date(dataset, '2012-12-01', '2012-12-31')

import numpy as np

def reshape_dataset(ds):
    Y = ds['cnt'].values
    ds_values = ds.drop(['dteday', 'cnt'], axis=1).values
    X = np.reshape(ds_values, (ds_values.shape[0], 1, ds_values.shape[1]))
    return X, Y

X_train, Y_train = reshape_dataset(train)
X_dev, Y_dev = reshape_dataset(dev)
X_val, Y_val = reshape_dataset(val)

import keras
from matplotlib import pyplot as plt
from IPython.display import clear_output

class PlotLosses(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.i = 0
        self.x = []
        self.losses = []
        self.val_losses = []
        self.fig = plt.figure()
        self.logs = []

    def on_epoch_end(self, epoch, logs={}):
        self.logs.append(logs)
        self.x.append(self.i)
        self.losses.append(logs.get('loss'))
        self.val_losses.append(logs.get('val_loss'))
        self.i += 1
        clear_output(wait=True)
        plt.plot(self.x, self.losses, label="loss")
        plt.plot(self.x, self.val_losses, label="val_loss")
        plt.legend()
        plt.show()

plot_losses = PlotLosses()

from keras.models import Model
from keras.layers import Input, Dense, LSTM, Dropout

def get_model():
    input = Input(shape=(1, 58))
    x = LSTM(200)(input)
    x = Dropout(.5)(x)
    activation = Dense(1, activation='linear')(x)
    model = Model(inputs=input, outputs=activation)
    optimizer = keras.optimizers.Adam(lr=0.01,
                                      beta_1=0.9,
                                      beta_2=0.999,
                                      epsilon=None,
                                      decay=0.001,
                                      amsgrad=False)
    model.compile(loss='mean_absolute_error', optimizer=optimizer)
    model.summary()
    return model

get_model()

def train_model(model, X_train, Y_train, validation, callbacks):
    model.fit(X_train, Y_train, epochs=200, batch_size=1024, validation_data=validation, callbacks=callbacks, shuffle=False)
    return model

model = train_model(get_model(), X_train, Y_train, (X_dev, Y_dev), [plot_losses])
A similar question about python - time series prediction with Keras - model ValueError can be found on Stack Overflow: https://stackoverflow.com/questions/58698942/