
azure - Golang Azure Blob Storage: 0B blob and overwritten downloaded blob data

Reposted. Author: IT王子. Updated: 2023-10-29 02:11:31

Currently using: https://github.com/Azure/azure-sdk-for-go

Overview: I am currently downloading a blob from Azure Blob Storage, parsing that blob, and then uploading the transcribed blob back to storage in another folder called "filtered".

Problem: The uploaded blob does not end up in the filtered folder but in the root directory, and the blob is 0B with no data. The blob upload also appears to destroy the blob I just downloaded, leaving that blob at 0B with no data as well. Downloading the blob works fine, and I am able to get the []byte array of the data.

Code:

import (
	"bufio"
	"bytes"
	"compress/gzip"
	"crypto/md5"
	"encoding/base64"
	"fmt"
	"io/ioutil"
	"math/big"
	"os"
	"strconv"
	"strings"

	"github.com/Azure/azure-sdk-for-go/storage"
)

func main() {
	var filter bool = true // check smart filter
	test := 0
	configfile, err := os.Open("config.txt") // open config file
	check(err)                               // check file opened
	ConfigScanner := bufio.NewScanner(configfile) // open buffer
	ConfigScanner.Scan()                          // get serial number
	serialnum := ConfigScanner.Text()
	configfile.Close()             // close the config file
	CanLUT := ParseDBC("file.dbc") // parse the associated DBC file
	check(err)                     // check file opened
	m := make(map[int64]string)    // map of last seen message
	// Azure API
	client, err := storage.NewBasicClient(accountName, accountKey) // get client from azure
	check(err)
	bsc := client.GetBlobService()                    // access blob service
	cnt := bsc.GetContainerReference("containerblob") // get container of the blob
	LBP := storage.ListBlobsParameters{}
	LBP.Prefix = "dev4/dev4"        // only get blobs with dev4/dev4 prefix
	list, err := cnt.ListBlobs(LBP) // get list of all matching blobs
	check(err)
	for _, b := range list.Blobs { // read all blobs from azure with prefix dev4/dev4
		oa := make([]byte, 0)
		fmt.Println("getting blob: ", b.Name)
		readCloser, err := b.Get(nil) // get blob data
		check(err)
		bytesRead, err := ioutil.ReadAll(readCloser) // read blob data to byte[]
		check(err)
		if len(bytesRead) < 1 {
			continue
		}
		br := bytes.NewReader(bytesRead)
		zr, err := gzip.NewReader(br) // use gzip reader for zipped data
		check(err)
		uz, err := ioutil.ReadAll(zr) // uz byte[] of unzipped file
		check(err)
		readCloser.Close() // close the reader
		zr.Close()         // close gzip reader
		r := bytes.NewReader(uz)
		scanner := bufio.NewScanner(r)
		for scanner.Scan() { // loop on each line in the input file
			temp := ParseToFrame(scanner.Text()) // parse the line into a usable struct
			_, exists := m[temp.nodeid]          // check if the frame has already been seen and is stored in the hashmap
			if exists { // if it exists in the map
				if ChkDuplicate(m, temp) { // is the msg a duplicate? if true the message isn't, so add it
					m[temp.nodeid] = temp.data                        // update the data in the hashmap
					DecodeFrame(temp, &oa, CanLUT, filter, serialnum) // decode the frame and output it to another file
				}
			} else { // DNE in map so add it
				m[temp.nodeid] = temp.data
				DecodeFrame(temp, &oa, CanLUT, filter, serialnum) // decode the frame and output it to another file
			}
		} // end blob file
		filestr := strings.Split(b.Name, "_")[1]
		filestr = "filtered/filtered_" + filestr
		var buffout bytes.Buffer
		gz := gzip.NewWriter(&buffout)
		_, err = gz.Write(oa)
		check(err)
		gz.Flush()
		gz.Close()
		compressedData := buffout.Bytes()
		// push block blob to azure
		fmt.Println("uploading: ", filestr)
		clientnew, err := storage.NewBasicClient(accountName, accountKey) // get client from azure
		check(err)
		senderbsc := clientnew.GetBlobService()                   // access blob service
		sendercnt := senderbsc.GetContainerReference("storeblob") // get container of store blob
		bblob := sendercnt.GetBlobReference("filtered_" + strings.Split(b.Name, "/")[1])
		err = bblob.CreateBlockBlob(nil)
		check(err)
		blockID := base64.StdEncoding.EncodeToString([]byte("00000"))
		err = bblob.PutBlock(blockID, compressedData, nil)
		check(err)
		list, err := b.GetBlockList(storage.BlockListTypeUncommitted, nil)
		check(err)
		uncommittedBlocksList := make([]storage.Block, len(list.UncommittedBlocks))
		for i := range list.UncommittedBlocks {
			uncommittedBlocksList[i].ID = list.UncommittedBlocks[i].Name
			uncommittedBlocksList[i].Status = storage.BlockStatusUncommitted
		}
		err = b.PutBlockList(uncommittedBlocksList, nil)
		// check if upload was good
		CheckHash(&compressedData, filestr, sendercnt)
		check(err)
		if test == 0 {
			break // test: only read one file
		}
		test++
	} // end for blobs
} // end main

Best Answer

As @DavidMakogon said, you can use the API CreateBlockBlobFromReader of the Azure Storage SDK for Go to upload from any reader that implements the io.Reader interface.

Here is my sample code.

accountName := "<your-storage-account-name>"
accountKey := "<your-storage-account-key>"
client, _ := storage.NewBasicClient(accountName, accountKey)
blobClient := client.GetBlobService()
containerName := "mycontainer"
container := blobClient.GetContainerReference(containerName)

// Two sample ways of uploading
// 1. Upload a text blob from a string reader
blobName := "upload.txt"
blob := container.GetBlobReference(blobName)
strReader := strings.NewReader("upload text to blob from string reader")
blob.CreateBlockBlobFromReader(strReader, nil)

// 2. Upload a file from a file reader
fileName := "hello.png"
file, _ := os.Open(fileName)
blobName = "hello.png"
blob = container.GetBlobReference(blobName)
blob.CreateBlockBlobFromReader(file, nil)

Hope it helps.

Regarding "azure - Golang Azure Blob Storage: 0B blob and overwritten downloaded blob data", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/45470989/
