
java - SuperCSV output is unreadable?

Reposted. Author: 行者123. Updated: 2023-11-30 09:33:06

I have a utility class that my Spring controllers call to generate a CSV from a set of beans using the SuperCSV library (http://supercsv.sourceforge.net/).

The utility class is very basic:

public static void export2CSV(HttpServletResponse response,
        String[] header, String filePrefix, List<? extends Object> dataObjs) {
    try {
        response.setContentType("text/csv;charset=utf-8");
        response.setHeader("Content-Disposition",
                "attachment; filename=" + filePrefix + "_Data.csv");

        OutputStream fout = response.getOutputStream();
        OutputStream bos = new BufferedOutputStream(fout);
        OutputStreamWriter outputwriter = new OutputStreamWriter(bos);

        ICsvBeanWriter writer = new CsvBeanWriter(outputwriter, CsvPreference.EXCEL_PREFERENCE);

        // the actual writing
        writer.writeHeader(header);

        for (Object anObj : dataObjs) {
            writer.write(anObj, header);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}

The catch is that I get different behavior from this call and I don't know why. When I call it from one controller (call it "A"), I get the expected data output.

When I call it from another controller ("B"), I get a short chunk of unrecognizable binary data that OO Calc cannot open. Opening it in Notepad++ yields a single line of unreadable gibberish, which I can only assume is the editor's attempt at rendering a binary stream.

Here is the controller "A" call (the one that works):

@RequestMapping(value = "/getFullReqData.html", method = RequestMethod.GET)
public void getFullData(HttpSession session, HttpServletRequest request, HttpServletResponse response) throws IOException {
    logger.info("INFO: ******************************Received request for full Req data dump");
    String projName = (String) session.getAttribute("currentProject");
    int projectID = ProjectService.getProjectID(projName);
    List<Requirement> allRecords = reqService.getFullDataSet(projectID);

    final String[] header = new String[] {
        "ColumnA",
        "ColumnB",
        "ColumnC",
        "ColumnD",
        "ColumnE"
    };

    CSVExporter.export2CSV(response, header, projName + "_reqs_", allRecords);
}

...and here is the controller "B" call (the one that fails):

@RequestMapping(value = "/getFullTCData.html", method = RequestMethod.GET)
public void getFullData(HttpSession session, HttpServletRequest request, HttpServletResponse response) throws IOException {
    logger.info("INFO: Received request for full TCD data dump");
    String projName = (String) session.getAttribute("currentProject");
    int projectID = ProjectService.getProjectID(projName);
    List<TestCase> allRecords = testService.getFullTestCaseList(projectID);

    final String[] header = new String[] {
        "ColumnW",
        "ColumnX",
        "ColumnY",
        "ColumnZ"
    };

    CSVExporter.export2CSV(response, header, projName + "_tcs_", allRecords);
}

Observations:

  • Which controller I call first is irrelevant; "A" always works and "B" always produces gibberish.
  • Both calls pass a list of header columns that is a subset of the full set of accessors defined on the bean handed to the CSVWriter.
  • The simple printStackTrace in the exception handler, which would catch a mismatch between the bean's reflected fields and the header definitions (i.e. no get() found to fetch a value), indicates that all column/variable matches succeed.
  • In the debugger, I have verified that writer.write(Object, header) is reached the expected number of times given the number of objects passed, and that those objects contain the expected data.

Any suggestions or insights would be appreciated. I'm really stumped as to how to isolate this problem any further...

Best Answer

You're not closing the writer. Also, CsvBeanWriter wraps the supplied writer in a BufferedWriter itself, so you can simplify your outputwriter as well:

public static void export2CSV(HttpServletResponse response,
        String[] header, String filePrefix, List<? extends Object> dataObjs) {
    ICsvBeanWriter writer = null; // initialized to null so the finally block compiles
    try {
        response.setContentType("text/csv;charset=utf-8");
        response.setHeader("Content-Disposition",
                "attachment; filename=" + filePrefix + "_Data.csv");

        OutputStreamWriter outputwriter =
                new OutputStreamWriter(response.getOutputStream());

        writer = new CsvBeanWriter(outputwriter, CsvPreference.EXCEL_PREFERENCE);

        // the actual writing
        writer.writeHeader(header);

        for (Object anObj : dataObjs) {
            writer.write(anObj, header);
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            if (writer != null) {
                writer.close(); // closes writer and underlying stream
            }
        } catch (Exception e) {
            // ignore close failure
        }
    }
}
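The failure mode is easy to reproduce without SuperCSV at all: a BufferedOutputStream (and the OutputStreamWriter's internal encoder buffer) holds bytes in memory until the chain is flushed or closed, so a small payload that never gets closed may never reach the servlet's output stream. A minimal stdlib sketch, using an in-memory ByteArrayOutputStream standing in for response.getOutputStream():

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class BufferDemo {
    public static void main(String[] args) throws IOException {
        // Without close(): the small payload sits in the buffers and never
        // reaches the underlying stream.
        ByteArrayOutputStream leaked = new ByteArrayOutputStream();
        Writer w1 = new OutputStreamWriter(new BufferedOutputStream(leaked), StandardCharsets.UTF_8);
        w1.write("ColumnW,ColumnX\n");
        System.out.println("without close: " + leaked.size() + " bytes"); // 0 bytes

        // With close(): closing the outermost writer flushes the whole chain.
        ByteArrayOutputStream flushed = new ByteArrayOutputStream();
        Writer w2 = new OutputStreamWriter(new BufferedOutputStream(flushed), StandardCharsets.UTF_8);
        w2.write("ColumnW,ColumnX\n");
        w2.close();
        System.out.println("with close: " + flushed.size() + " bytes"); // 16 bytes
    }
}
```

This also suggests why one controller can appear to work while the other fails: a dataset large enough to overflow the internal buffers gets partially written even without a close, while a small dataset never leaves memory.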

Super CSV 2.0.0-beta-1现在出来了!除了添加许多其他功能(包括 Maven 支持和新的 Dozer 扩展),CSV 编写器现在还公开了一个 flush() 方法。
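On Java 7 and later, the try/finally cleanup can also be expressed with try-with-resources, which guarantees close() (and therefore a flush) even if writing throws. A sketch of the pattern using a plain java.io.Writer chain; the export helper and its hand-rolled CSV assembly are illustrative stand-ins, not SuperCSV API, but any Closeable writer (including ICsvBeanWriter) works the same way in the try header:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class TryWithResourcesDemo {
    // Hypothetical helper mirroring export2CSV's shape: the writer declared in
    // the try header is closed automatically when the block exits.
    static byte[] export(String[] header, String[][] rows) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream(); // stands in for response.getOutputStream()
        try (Writer w = new OutputStreamWriter(new BufferedOutputStream(out), StandardCharsets.UTF_8)) {
            w.write(String.join(",", header));
            w.write("\n");
            for (String[] row : rows) {
                w.write(String.join(",", row));
                w.write("\n");
            }
        } // w.close() runs here, flushing the buffered bytes into 'out'
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] csv = export(new String[] {"ColumnW", "ColumnX"}, new String[][] {{"1", "2"}});
        System.out.print(new String(csv, StandardCharsets.UTF_8));
    }
}
```

The same shape applied to the original method would be `try (ICsvBeanWriter writer = new CsvBeanWriter(...)) { ... }`, eliminating the finally block entirely.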

Regarding "java - SuperCSV output is unreadable?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/12358184/
