
MySQL has gone away on Media Temple

Reposted. Author: 行者123. Updated: 2023-11-29 05:27:35

I would really appreciate some help with this.

When uploading a large 75 MB file I keep getting the error "MySQL has gone away". Anything under 20 MB uploads fine.

I have already looked into this error in other posts here. I am hosted on Media Temple, but unfortunately they say this is beyond the scope of their support.

I have edited the my.cnf file at /etc/my.cnf:

[client]
port = 3306
socket = /var/lib/mysql/mysql.sock

[mysqld_safe]
log-error=/var/log/mysqld.log
pid-file=/var/run/mysqld/mysqld.pid

innodb_buffer_pool_size=2M
innodb_additional_mem_pool_size=500K
innodb_log_buffer_size=500K
innodb_thread_concurrency=2

[mysqld]
local-infile=0
datadir=/var/lib/mysql
user=mysql
symbolic-links=0

max_connections = 150
wait_timeout = 600
query-cache-type = 1
query-cache-size = 16M
query_cache_limit = 2M
thread_cache_size = 16
tmp_table_size = 32M
max_heap_table_size = 32M
join_buffer_size = 2M
table_open_cache = 128

port = 3306
socket = /var/lib/mysql/mysql.sock
skip-external-locking
key_buffer_size = 16M
max_allowed_packet = 1M
sort_buffer_size = 512K
net_buffer_length = 8K
read_buffer_size = 256K
read_rnd_buffer_size = 512K
myisam_sort_buffer_size = 8M

innodb_buffer_pool_size = 16M
innodb_additional_mem_pool_size = 2M
innodb_log_buffer_size = 8M

[mysqldump]
quick
max_allowed_packet = 16M

[mysql]
no-auto-rehash

[myisamchk]
key_buffer_size = 20M
sort_buffer_size = 20M
read_buffer = 2M
write_buffer = 2M

So I have already increased wait_timeout to 600, which should be enough?
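
To confirm that the running server has actually picked these values up (changes in /etc/my.cnf only apply after MySQL is restarted), a quick check like the one below can be run; it is only a minimal sketch with placeholder credentials, not part of my application:

<?php
// Ask MySQL which values it is really running with (placeholder credentials).
$mysqli = new mysqli('localhost', 'db_user', 'db_pass');
if ($mysqli->connect_error) {
    die('Connect failed: ' . $mysqli->connect_error);
}

$result = $mysqli->query(
    "SHOW VARIABLES WHERE Variable_name IN ('wait_timeout', 'max_allowed_packet')"
);
while ($row = $result->fetch_assoc()) {
    echo $row['Variable_name'] . ' = ' . $row['Value'] . PHP_EOL;
}
$mysqli->close();

max_allowed_packet is reported in bytes and wait_timeout in seconds.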

My php.ini file is set up as follows.

[PHP]
soap.wsdl_cache_limit = 5
include_path = ".:"
cli_server.color = On
mysql.allow_persistent = On
mysqli.max_persistent = -1
session.bug_compat_42 = Off
mysql.connect_timeout = -1
session.use_only_cookies = 1
register_argc_argv = Off
mssql.min_error_severity = 10
open_basedir = "/var/www/vhosts/s3bubble.com/:/tmp/"
session.name = PHPSESSID
mysqlnd.collect_statistics = On
session.hash_function = 0
session.gc_probability = 1
log_errors_max_len = 1024
mssql.secure_connection = Off
pgsql.max_links = -1
variables_order = "GPCS"
ldap.max_links = -1
sybct.allow_persistent = On
max_input_time = 600
odbc.max_links = -1
session.save_handler = files
mysqli.cache_size = 2000
pgsql.auto_reset_persistent = Off
error_reporting = E_ALL & ~E_DEPRECATED & ~E_STRICT
auto_prepend_file =
sendmail_path = /usr/sbin/sendmail -t -i
sybct.min_client_severity = 10
pgsql.max_persistent = -1
auto_globals_jit = On
soap.wsdl_cache_ttl = 86400
allow_url_fopen = On
zend.enable_gc = On
mysqli.allow_persistent = On
tidy.clean_output = Off
display_startup_errors = Off
user_dir =
session.cookie_lifetime = 0
mysqli.max_links = -1
default_socket_timeout = 900
session.serialize_handler = php
session.hash_bits_per_character = 5
unserialize_callback_func =
pdo_mysql.cache_size = 2000
default_mimetype = "text/html"
session.cache_expire = 180
max_execution_time = 600
mail.add_x_header = On
upload_max_filesize = 1G
ibase.max_links = -1
safe_mode = off
zlib.output_compression = Off
ignore_repeated_errors = Off
odbc.max_persistent = -1
mssql.compatability_mode = Off
file_uploads = On
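
For completeness, the limits PHP is actually running with can be confirmed from a small test page like the one below (only a sketch; post_max_size is listed too because it also caps uploads, even though it does not appear in the excerpt above):

<?php
// Print the upload-related limits from the loaded PHP configuration.
$directives = array(
    'upload_max_filesize',
    'post_max_size',
    'max_execution_time',
    'max_input_time',
    'memory_limit',
);
foreach ($directives as $directive) {
    echo $directive . ' = ' . ini_get($directive) . PHP_EOL;
}

If post_max_size is smaller than upload_max_filesize, the smaller value is the effective limit.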

So I have already raised the timeouts. This is driving me crazy, so any help or suggestions would be much appreciated.

Here is the script:

public function Uploader()
{
    header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");
    header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
    header("Cache-Control: no-store, no-cache, must-revalidate");
    header("Cache-Control: post-check=0, pre-check=0", false);
    header("Pragma: no-cache");

    $targetDir = $_SERVER['DOCUMENT_ROOT'] . '/uploads/' . $this->session->userdata('account_id') . '/folder/' . $_REQUEST['playlist_id'];
    if (!file_exists($targetDir)) {
        mkdir($targetDir, 0777);
    }

    // Get parameters
    $chunk = isset($_REQUEST["chunk"]) ? intval($_REQUEST["chunk"]) : 0;
    $chunks = isset($_REQUEST["chunks"]) ? intval($_REQUEST["chunks"]) : 0;
    $fileName = isset($_REQUEST["name"]) ? $_REQUEST["name"] : '';

    // Clean the fileName for security reasons
    $fileName = preg_replace('/[^\w\._]+/', '_', $fileName);

    // Make sure the fileName is unique, but only if chunking is disabled
    if ($chunks < 2 && file_exists($targetDir . DIRECTORY_SEPARATOR . $fileName)) {
        $ext = strrpos($fileName, '.');
        $fileName_a = substr($fileName, 0, $ext);
        $fileName_b = substr($fileName, $ext);

        $count = 1;
        while (file_exists($targetDir . DIRECTORY_SEPARATOR . $fileName_a . '_' . $count . $fileName_b)) {
            $count++;
        }

        $fileName = $fileName_a . '_' . $count . $fileName_b;
    }

    $filePath = $targetDir . DIRECTORY_SEPARATOR . $fileName;

    // Create target dir
    if (!file_exists($targetDir)) {
        @mkdir($targetDir);
    }

    // Look for the content type header
    if (isset($_SERVER["HTTP_CONTENT_TYPE"])) {
        $contentType = $_SERVER["HTTP_CONTENT_TYPE"];
    }

    if (isset($_SERVER["CONTENT_TYPE"])) {
        $contentType = $_SERVER["CONTENT_TYPE"];
    }

    // Handle non-multipart uploads; older WebKit versions did not support multipart in HTML5
    if (strpos($contentType, "multipart") !== false) {
        if (isset($_FILES['file']['tmp_name']) && is_uploaded_file($_FILES['file']['tmp_name'])) {
            // Open temp file
            $out = fopen("{$filePath}.part", $chunk == 0 ? "wb" : "ab");
            if ($out) {
                // Read binary input stream and append it to temp file
                $in = fopen($_FILES['file']['tmp_name'], "rb");

                if ($in) {
                    while ($buff = fread($in, 4096)) {
                        fwrite($out, $buff);
                    }
                } else {
                    die('{"jsonrpc" : "2.0", "error" : {"code": 101, "message": "Failed to open input stream."}, "id" : "id"}');
                }
                fclose($in);
                fclose($out);
                @unlink($_FILES['file']['tmp_name']);
            } else {
                die('{"jsonrpc" : "2.0", "error" : {"code": 102, "message": "Failed to open output stream."}, "id" : "id"}');
            }
        } else {
            die('{"jsonrpc" : "2.0", "error" : {"code": 103, "message": "Failed to move uploaded file."}, "id" : "id"}');
        }
    } else {
        // Open temp file
        $out = fopen("{$filePath}.part", $chunk == 0 ? "wb" : "ab");
        if ($out) {
            // Read binary input stream and append it to temp file
            $in = fopen("php://input", "rb");

            if ($in) {
                while ($buff = fread($in, 4096)) {
                    fwrite($out, $buff);
                }
            } else {
                die('{"jsonrpc" : "2.0", "error" : {"code": 101, "message": "Failed to open input stream."}, "id" : "id"}');
            }

            fclose($in);
            fclose($out);
        } else {
            die('{"jsonrpc" : "2.0", "error" : {"code": 102, "message": "Failed to open output stream."}, "id" : "id"}');
        }
    }

    // Check if the file has been fully uploaded (last chunk, or no chunking)
    if (!$chunks || $chunk == $chunks - 1) {

        $response = shell_exec("cd {$targetDir}/ && runs some script here 2>&1");

        $data = array(
            'filename' => $fileName,
        );

        if ($response) {
            // Add values to database
            $this->uploader_model->addFileData($data);
        }
    }
}

Best Answer

Please increase the wait_timeout value.

Two possible reasons may be:

  1. The server timed out and closed the connection. How to fix: check that the wait_timeout variable in your mysqld's my.cnf/my.ini configuration file is large enough (an application-side workaround is also sketched below).
  2. The server dropped an incorrect or too large packet. If mysqld receives a packet that is too large or incorrect, it assumes that something has gone wrong with the client and closes the connection. You can increase the maximum packet size limit by increasing the value of max_allowed_packet in the my.cnf/my.ini file (a sample configuration is sketched below).
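
For reference, here is a sketch of how both values could be raised in the [mysqld] section of /etc/my.cnf; the numbers are illustrative, not a recommendation for every setup:

[mysqld]
# Illustrative values only; pick sizes that fit your workload.
wait_timeout       = 28800   # seconds an idle connection is kept open (the MySQL default)
max_allowed_packet = 64M     # largest single packet the server accepts (the config above uses 1M)

After editing the file, restart MySQL so the new values take effect, and confirm them with SHOW VARIABLES (for example with the check sketched earlier in the question).

Separately, the chunked upload and the shell_exec step in the posted controller can keep the request running for a long time before addFileData() is called, so the connection may simply have idled past wait_timeout by the time the insert runs. Assuming the framework is CodeIgniter (which the $this->session and model calls suggest) and the default database connection is loaded, one defensive option is to reconnect just before the insert; this is a hedged sketch, not the poster's code:

// Inside Uploader(), just before the database insert:
if (!$chunks || $chunk == $chunks - 1) {
    $response = shell_exec("cd {$targetDir}/ && runs some script here 2>&1");

    $data = array('filename' => $fileName);

    if ($response) {
        // Re-establish the MySQL connection in case it timed out during the upload.
        $this->db->reconnect();
        $this->uploader_model->addFileData($data);
    }
}

CodeIgniter's reconnect() is documented as keeping the connection alive or reconnecting when no queries have been run for some time; whether it can fully recover a dropped connection depends on the driver, so raising the server-side limits above remains the primary fix.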

Regarding "MySQL has gone away" on Media Temple, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/18315130/
