
linux - Storing a URL in a bash variable causes curl to fail

Reposted. Author: 太空宇宙. Updated: 2023-11-04 09:46:01

In a bash script, I store a URL produced by an earlier command in the bash variable $DESTINATION_URL. I want to run a curl command using this variable.

If I use the $DESTINATION_URL variable, the curl command fails.

If I try the same curl command with the URL written out literally, it works fine. The & seems to be causing the problem, but I don't understand why.

Example:
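For reference, an unquoted & really would break the command, because the shell treats it as the job-control operator rather than as part of the URL. A minimal illustration (hypothetical example.com URL; the variable here is quoted, as in the session below, so the full string survives):

```shell
URL='http://example.com/a?op=CREATE&user.name=hdfs'

# Unquoted, the shell would split at each '&': curl would be sent to the
# background with only 'http://example.com/a?op=CREATE', and the shell would
# then try to run 'user.name=hdfs' as a command.
#   curl $URL          # broken expansion

# Quoted, the whole URL reaches the program intact:
echo "$URL"
```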

ha@hadoop-fullslot1:~$ echo $DESTINATION_URL
http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true


ha@hadoop-fullslot1:~$ curl -v -s -i -X PUT -T $SOURCE "$DESTINATION_URL"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
* Trying 10.1.3.39... connected
HTTP/1.1bhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
* Empty reply from server
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0


ha@hadoop-fullslot1:~$ curl -v -s -i -X PUT -T $SOURCE "http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
* Trying 10.1.3.39... connected
> PUT /webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true HTTP/1.1
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
HTTP/1.1 100 Continue

* We are completely uploaded and fine
< HTTP/1.1 201 Created
HTTP/1.1 201 Created
< Cache-Control: no-cache
Cache-Control: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
< Content-Type: application/octet-stream
Content-Type: application/octet-stream
< Content-Length: 0
Content-Length: 0
< Server: Jetty(6.1.26.cloudera.2)
Server: Jetty(6.1.26.cloudera.2)

<
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0
ha@hadoop-fullslot1:~$

Best answer

Your variable contains garbage in addition to the URL. My guess would be a CR (carriage return) byte or something similar; notice how "HTTP/1.1" is printed at the start of the line in your first transcript, even though it should appear to the right of the URL...
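If that is the cause, you can confirm and fix it in the shell itself. A minimal sketch, with the stray CR injected artificially here for illustration (such bytes often sneak in when the earlier command's output has CRLF line endings):

```shell
# Hypothetical reproduction: a URL variable with a trailing carriage return.
DESTINATION_URL=$'http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE\r'

# Diagnose: print the value with control characters made visible.
# A trailing CR shows up as \r here (or as ^M via: printf '%s' "$VAR" | cat -v).
printf '%q\n' "$DESTINATION_URL"

# Fix: strip every carriage return before handing the value to curl.
DESTINATION_URL=${DESTINATION_URL//$'\r'/}
printf '%q\n' "$DESTINATION_URL"
```

The `${var//pattern/}` pattern substitution is a bash feature; in plain POSIX sh you could pipe the value through `tr -d '\r'` instead.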

Regarding "linux - Storing a URL in a bash variable causes curl to fail", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/16232623/
