
asp.net-mvc - Uploading a CSV file with C# (ASP.NET MVC)

Reposted · Author: 行者123 · Updated: 2023-12-01 22:42:46

I have a CSV file with the following content:

ProductName,EmployeeID,EmployeeName,ContactNo,Address
iPad,1233,Tom,89897898,34 Pitt st
iPad,1573,Jack,8978 9689,50 George st
iPad,1893,Peter,8878 8989,32 Martin st

The following code inserts into a single table. What I want to achieve is inserting into 2 tables:

Product table (parent table): ProductId (PK), ProductName
Employee table (child table): EmployeeId (PK), ProductId (FK), EmployeeName, ContactNo, Address
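Expressed as DDL, the two tables described above might look like this (the column types and lengths are assumptions; the question does not give them):

```sql
-- Hypothetical schema sketch; types/lengths are assumed, not from the question.
CREATE TABLE Product (
    ProductId   uniqueidentifier NOT NULL PRIMARY KEY,
    ProductName varchar(50)      NOT NULL
);

CREATE TABLE Employee (
    EmployeeId   uniqueidentifier NOT NULL PRIMARY KEY,
    ProductId    uniqueidentifier NOT NULL REFERENCES Product (ProductId),
    EmployeeName varchar(50)      NOT NULL,
    ContactNo    varchar(20)      NULL,
    Address      varchar(100)     NULL
);
```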

So, from the CSV file, I basically need to insert the record first into the Product table and then into the Employee table.

Controller.cs

[HttpPost]
public ActionResult Index(HttpPostedFileBase FileUpload)
{
    // Set up DataTable place holder
    Guid ProductId = Guid.NewGuid();

    using (SqlConnection conn = new SqlConnection(connString))
    {
        conn.Open();

        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO Product VALUES (@ProductId, @ProductName)", conn))
        {
            // Note: the product name needs to be read from the CSV file
            cmd.Parameters.AddWithValue("@ProductId", ProductId);
            cmd.Parameters.AddWithValue("@ProductName", ProductName);

            int rows = cmd.ExecuteNonQuery();
            // rows = number of records inserted
        }
    }

    DataTable dt = new DataTable();

    // Check we have a file
    if (FileUpload.ContentLength > 0)
    {
        // Work out our file path
        string fileName = Path.GetFileName(FileUpload.FileName);
        string path = Path.Combine(Server.MapPath("~/App_Data/uploads"), fileName);

        // Try and upload
        try
        {
            FileUpload.SaveAs(path);

            // Process the CSV file and capture the results to our DataTable place holder
            dt = ProcessCSV(path);

            // Process the DataTable and capture the results of our SQL bulk copy
            ViewData["Feedback"] = ProcessBulkCopy(dt);
        }
        catch (Exception ex)
        {
            // Catch errors
            ViewData["Feedback"] = ex.Message;
        }
    }
    else
    {
        ViewData["Feedback"] = "Please select a file";
    }

    // Tidy up
    dt.Dispose();

    return View("Index", ViewData["Feedback"]);
}

/// <summary>
/// Process the supplied file and parse the CSV into a dynamic DataTable
/// </summary>
/// <param name="fileName">String</param>
/// <returns>DataTable</returns>
private static DataTable ProcessCSV(string fileName)
{
    // Set up our variables
    string line = string.Empty;
    string[] strArray;
    DataTable dt = new DataTable();
    DataRow row;

    // Work out where to split: on a comma, but not a comma inside quotes
    Regex r = new Regex(",(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))");

    // Open the file in a stream
    StreamReader sr = new StreamReader(fileName);

    // Read the first line and split it at the commas, via our regular expression, into an array
    line = sr.ReadLine();
    strArray = r.Split(line);

    // For each item in the split array, dynamically build our data columns; saves us having to worry about it
    Array.ForEach(strArray, s => dt.Columns.Add(new DataColumn()));

    // Read each line in the CSV file until it's empty
    while ((line = sr.ReadLine()) != null)
    {
        row = dt.NewRow();

        // Add the current values to our data row
        row.ItemArray = r.Split(line);
        dt.Rows.Add(row);
    }

    // Tidy the StreamReader up
    sr.Dispose();

    // Return the new DataTable
    return dt;
}

/// <summary>
/// Take the DataTable and, using WriteToServer(DataTable), send it all to the database table "Employee" in one go
/// </summary>
/// <param name="dt">DataTable</param>
/// <returns>String</returns>
private static String ProcessBulkCopy(DataTable dt)
{
    string Feedback = string.Empty;
    string connString = ConfigurationManager.ConnectionStrings["DataBaseConnectionString"].ConnectionString;

    // Make our connection and dispose of it at the end
    using (SqlConnection conn = new SqlConnection(connString))
    {
        // Make our command and dispose of it at the end
        using (var copy = new SqlBulkCopy(conn))
        {
            // Open our connection
            conn.Open();

            // Set the target table and the batch size to the number of rows
            copy.DestinationTableName = "Employee";
            copy.BatchSize = dt.Rows.Count;
            try
            {
                // Send it to the server
                copy.WriteToServer(dt);
                Feedback = "Upload complete";
            }
            catch (Exception ex)
            {
                Feedback = ex.Message;
            }
        }
    }

    return Feedback;
}
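Note that the CSV columns do not line up one-to-one with the Employee table (the table also has a ProductId column, and SqlBulkCopy otherwise maps columns by ordinal), so explicit column mappings are usually needed. A minimal sketch, assuming the DataTable's columns have been named after the CSV header row (the question's ProcessCSV leaves them unnamed):

```csharp
// Sketch only: assumes dt's columns were named from the CSV header row.
using (var copy = new SqlBulkCopy(conn))
{
    copy.DestinationTableName = "Employee";

    // Map source DataTable columns to destination table columns by name.
    copy.ColumnMappings.Add("EmployeeID", "EmployeeId");
    copy.ColumnMappings.Add("EmployeeName", "EmployeeName");
    copy.ColumnMappings.Add("ContactNo", "ContactNo");
    copy.ColumnMappings.Add("Address", "Address");

    copy.WriteToServer(dt);
}
```

Any CSV column without a mapping (here, ProductName) is simply ignored by the bulk copy once mappings are supplied.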

View.aspx

<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
Home Page
</asp:Content>

<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">

<h2>CSV Bulk Upload</h2>

<% using (Html.BeginForm("","",FormMethod.Post, new {enctype="multipart/form-data"})){ %>

<input type="file" name="FileUpload" />
<input type="submit" name="Submit" id="Submit" value="Upload" />
<% } %>

<p><%= Html.Encode(ViewData["Feedback"]) %></p>
</asp:Content>

Stored procedure

USE [BULkDatabase]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER OFF
GO

CREATE PROCEDURE [dbo].[InsertProdutInfo]
(
    @ProductName varchar(50),
    @EmployeeName varchar(50),
    @EmployeeAddress varchar(50)
)
AS

BEGIN TRAN

DECLARE @ProductId uniqueidentifier

update [dbo].[Product]
set [ProductName] = @ProductName
where [ProductName] = @ProductName;

-- get product id
select @ProductId = [ProductId]
from [dbo].[Product]
where [ProductName] = @ProductName;

if @@rowcount = 0
begin
    -- there's no such product, let's create it
    -- (SCOPE_IDENTITY() does not apply here: ProductId is a uniqueidentifier,
    -- not an identity column, so generate the id up front)
    set @ProductId = NEWID();

    insert into [dbo].[Product]
    values (@ProductId, @ProductName);
end

-- now that we know we have the product and its id, let's add the rest
insert into [dbo].[Employees]
values (NEWID(), @EmployeeName, @EmployeeAddress, @ProductId);

COMMIT TRAN

Best answer

First of all, you should decouple the controller from the database code. Just create a new class project and host all the database access there, so that in your controller you can have something like:

[HttpPost]
public ActionResult UploadFile(HttpPostedFileBase FileUpload)
{
    if (FileUpload.ContentLength > 0)
    {
        // there's a file that needs our attention
        var success = db.UploadProductFile(FileUpload);

        // was everything ok?
        if (success)
            return View("UploadSuccess");
        else
            return View("UploadFail");
    }

    return RedirectToAction("Index", new { error = "Please upload a file..." });
}

public ActionResult Index(string error)
{
    ...
}

This way the controller doesn't really care how you process the uploaded file, because it isn't the controller's job to know such things. Its job is to know that it needs to delegate that work and handle the result, nothing more.

Note that the action method is named UploadFile rather than Index. Posting back to the same action is not good practice; a separate action avoids the form being posted again when the user refreshes the page.

I would also suggest you use the ADO.NET Entity Model; there are plenty of videos about it, including on the ASP.NET website, and it will greatly help you work with the database in a simpler and cleaner way.

Back to your problem... In your database class, the UploadProductFile method would be something like the following. Assuming you never have more than about 200 records to process, it's better to handle the file in memory than to spend time saving it and reading it back (for anything bigger, you should save the file and then process it, as you already do):

private bool UploadProductFile(HttpPostedFileBase FileUpload)
{
    // get the file stream in a readable way
    StreamReader reader = new StreamReader(FileUpload.InputStream);

    // get a DataTable representing the passed string
    System.Data.DataTable dt = ProcessCSV(reader.ReadToEnd());

    // for each row, compose the statement
    bool success = true;
    foreach (System.Data.DataRow row in dt.Rows)
        success = db.InsertProdutInfo(row);

    return success;
}

The InsertProdutInfo method would fire a stored procedure along the lines of:

declare @product_key int

begin tran

update [tbl_products]
set [name] = @product_name, [last_update] = getdate()
where [name] = @product_name;

-- get product id
select @product_key = [id]
from [tbl_products]
where [name] = @product_name;

if @@rowcount = 0
begin
    -- there's no such product, let's create it
    insert into [tbl_products] (name, last_update)
    values (@product_name, getdate());

    select @product_key = SCOPE_IDENTITY()
end

-- now that we know we have added the product and have the id, let's add the rest
insert into [tbl_Employees] (id, product_id, name, contact, address)
values (@employee_id, @product_key, @employee_name,
        @employee_contact, @employee_address);

commit tran

This way you'll have everything you need.
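The C# side of InsertProdutInfo is not shown in the answer. A minimal sketch of how it could call the stored procedure above (the parameter names are taken from the procedure, and the column ordinals are assumed from the CSV layout; both are assumptions, not code from the answer):

```csharp
// Hypothetical sketch of InsertProdutInfo; names and ordinals are assumptions.
private bool InsertProdutInfo(System.Data.DataRow row)
{
    using (var conn = new SqlConnection(connString))
    using (var cmd = new SqlCommand("InsertProdutInfo", conn))
    {
        cmd.CommandType = System.Data.CommandType.StoredProcedure;

        // CSV layout: ProductName, EmployeeID, EmployeeName, ContactNo, Address
        cmd.Parameters.AddWithValue("@product_name", row[0]);
        cmd.Parameters.AddWithValue("@employee_id", row[1]);
        cmd.Parameters.AddWithValue("@employee_name", row[2]);
        cmd.Parameters.AddWithValue("@employee_contact", row[3]);
        cmd.Parameters.AddWithValue("@employee_address", row[4]);

        conn.Open();
        return cmd.ExecuteNonQuery() > 0;
    }
}
```

Wrapping the connection and command in `using` blocks ensures both are disposed even if the call throws, matching the style of the question's own code.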

Regarding "asp.net-mvc - Uploading a CSV file with C# (ASP.NET MVC)", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/13963253/
