
python - Does the number of columns affect SQLAlchemy's speed?


I created two tables with SQLAlchemy (Python 2.7); the database is MySQL 5.5. Here is my code:

from time import time
from sqlalchemy import create_engine, MetaData, Table, Column, SmallInteger, BINARY, select
from sqlalchemy.dialects.mysql import TINYINT

engine = create_engine('mysql://root:123@localhost/test')
metadata = MetaData()
conn = engine.connect()

# For table 1:
columns = []
for i in xrange(100):
    columns.append(Column('c%d' % i, TINYINT, nullable = False, server_default = '0'))
    columns.append(Column('d%d' % i, SmallInteger, nullable = False, server_default = '0'))

user = Table('user', metadata, *columns)
# So user has 100 tinyint columns and 100 smallint columns.

# For table 2:
user2 = Table('user2', metadata,
    Column('c0', BINARY(100), nullable = False, server_default = '\0' * 100),
    Column('d0', BINARY(200), nullable = False, server_default = '\0' * 200),
)
# user2 has two columns of 100 bytes and 200 bytes respectively.

I then inserted 4000 rows into each table. Since these two tables have the same row length, I expected the select speed to be almost the same. I ran the following test code:

s1 = select([user]).compile(engine)
s2 = select([user2]).compile(engine)

t1 = time()
result = conn.execute(s1).fetchall()
print 't1:', time() - t1

t2 = time()
result = conn.execute(s2).fetchall()
print 't2:', time() - t2

The result is:

t1: 0.5120000
t2: 0.0149999

Does this mean the number of columns in a table will dramatically affect the performance of SQLAlchemy? Thanks in advance!

Best Answer

Does this mean the number of columns in a table will dramatically affect the performance of SQLAlchemy?

Well, that's a tough one. It probably has more to do with the underlying SQL engine, in this case MySQL, than with SQLAlchemy itself, which is really nothing more than a way of interacting with different database engines through the same interface.

SQLAlchemy is the Python SQL toolkit and Object Relational Mapper that gives application developers the full power and flexibility of SQL.

It provides a full suite of well known enterprise-level persistence patterns, designed for efficient and high-performing database access, adapted into a simple and Pythonic domain language.
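To illustrate that point: the same select() construct just compiles to SQL text for whichever dialect you hand it to, and the actual work then happens in the driver and the database. A minimal sketch, reusing the user table defined in the question:

from sqlalchemy import select
from sqlalchemy.dialects import mysql, sqlite

# The same statement rendered for two different engines; SQLAlchemy only
# produces the SQL text here, the heavy lifting is done by the driver/database.
stmt = select([user])
print str(stmt.compile(dialect=mysql.dialect()))
print str(stmt.compile(dialect=sqlite.dialect()))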

Though I could be wrong, you could try benchmarking it with plain SQL.
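For example, timing the same two SELECTs directly through the MySQLdb driver, with SQLAlchemy out of the picture entirely, could look roughly like this (a sketch that assumes the MySQL-python driver is installed and that the test database and the two tables from the question already exist):

import time
import MySQLdb

conn = MySQLdb.connect(host='localhost', user='root', passwd='123', db='test')
cur = conn.cursor()

for table in ('user', 'user2'):
    start = time.time()
    cur.execute('SELECT * FROM ' + table)  # fixed table names, not user input
    rows = cur.fetchall()
    print table, len(rows), 'rows in', time.time() - start, 'seconds'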

I actually ran some tests...

import timeit

setup = """
from sqlalchemy import create_engine, MetaData, select, Table, Column
from sqlalchemy.dialects.sqlite import BOOLEAN, SMALLINT, VARCHAR
engine = create_engine('sqlite://', echo = False)
metadata = MetaData()
conn = engine.connect()
columns = []

for i in xrange(100):
    columns.append(Column('c%d' % i, VARCHAR(1), nullable = False, server_default = '0'))
    columns.append(Column('d%d' % i, VARCHAR(2), nullable = False, server_default = '00'))


user = Table('user', metadata, *columns)
user.create(engine)
conn.execute(user.insert(), [{}] * 4000)

user2 = Table('user2', metadata, Column('c0', VARCHAR(100), nullable = False, server_default = '0' * 100), \
Column('d0', VARCHAR(200), nullable = False, server_default = '0' * 200))
user2.create(engine)
conn.execute(user2.insert(), [{}] * 4000)
"""

many_columns = """
s1 = select([user]).compile(engine)
result = conn.execute(s1).fetchall()
"""

two_columns = """
s2 = select([user2]).compile(engine)
result = conn.execute(s2).fetchall()
"""

raw_many_columns = "res = conn.execute('SELECT * FROM user').fetchall()"
raw_two_columns = "res = conn.execute('SELECT * FROM user2').fetchall()"

timeit.Timer(two_columns, setup).timeit(number = 1)
timeit.Timer(raw_two_columns, setup).timeit(number = 1)
timeit.Timer(many_columns, setup).timeit(number = 1)
timeit.Timer(raw_many_columns, setup).timeit(number = 1)

>>> timeit.Timer(two_columns, setup).timeit(number = 1)
0.010751008987426758
>>> timeit.Timer(raw_two_columns, setup).timeit(number = 1)
0.0099620819091796875
>>> timeit.Timer(many_columns, setup).timeit(number = 1)
0.23563408851623535
>>> timeit.Timer(raw_many_columns, setup).timeit(number = 1)
0.21881699562072754

I did find this:
http://www.mysqlperformanceblog.com/2009/09/28/how-number-of-columns-affects-performance/

which is somewhat interesting, even though he tests using max...

I really like sqlalchemy, so I decided to compare it against Python's own sqlite3 module:

import timeit
setup = """
import sqlite3
conn = sqlite3.connect(':memory:')
c = conn.cursor()

c.execute('CREATE TABLE user (%s)' %\
("".join(("c%i VARCHAR(1) DEFAULT '0' NOT NULL, d%i VARCHAR(2) DEFAULT '00' NOT NULL," % (index, index) for index in xrange(99))) +\
"c99 VARCHAR(1) DEFAULT '0' NOT NULL, d99 VARCHAR(2) DEFAULT '0' NOT NULL"))

c.execute("CREATE TABLE user2 (c0 VARCHAR(100) DEFAULT '%s' NOT NULL, d0 VARCHAR(200) DEFAULT '%s' NOT NULL)" % ('0'* 100, '0'*200))

conn.commit()
c.executemany('INSERT INTO user VALUES (%s)' % ('?,' * 199 + '?'), [('0',) * 200] * 4000)
c.executemany('INSERT INTO user2 VALUES (?,?)', [('0'*100, '0'*200)] * 4000)
conn.commit()
"""

many_columns = """
r = c.execute('SELECT * FROM user')
all = r.fetchall()
"""

two_columns = """
r2 = c.execute('SELECT * FROM user2')
all = r2.fetchall()
"""

timeit.Timer(many_columns, setup).timeit(number = 1)
timeit.Timer(two_columns, setup).timeit(number = 1)

>>> timeit.Timer(many_columns, setup).timeit(number = 1)
0.21009302139282227
>>> timeit.Timer(two_columns, setup).timeit(number = 1)
0.0083379745483398438

and came up with the same result, so I really think it is the database implementation and not a SQLAlchemy issue.
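One way to double-check that conclusion is to profile the SELECT round trip and see where the time actually goes. A quick sketch, assuming the engine, conn and user objects from the benchmark above are still in scope in the interactive session:

import cProfile
from sqlalchemy import select

# Profiles the full round trip: statement compilation, execution in the
# DB-API driver, and fetching/converting all rows.
cProfile.run("conn.execute(select([user]).compile(engine)).fetchall()")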

Default insertion:

import timeit

setup = """
from sqlalchemy import create_engine, MetaData, select, Table, Column
from sqlalchemy.dialects.sqlite import BOOLEAN, SMALLINT, VARCHAR
engine = create_engine('sqlite://', echo = False)
metadata = MetaData()
conn = engine.connect()
columns = []

for i in xrange(100):
    columns.append(Column('c%d' % i, VARCHAR(1), nullable = False, server_default = '0'))
    columns.append(Column('d%d' % i, VARCHAR(2), nullable = False, server_default = '00'))


user = Table('user', metadata, *columns)
user.create(engine)

user2 = Table('user2', metadata, Column('c0', VARCHAR(100), nullable = False, server_default = '0' * 100), \
Column('d0', VARCHAR(200), nullable = False, server_default = '0' * 200))
user2.create(engine)
"""

many_columns = """
conn.execute(user.insert(), [{}] * 4000)
"""

two_columns = """
conn.execute(user2.insert(), [{}] * 4000)
"""

>>> timeit.Timer(two_columns, setup).timeit(number = 1)
0.017949104309082031
>>> timeit.Timer(many_columns, setup).timeit(number = 1)
0.047809123992919922

Testing with the sqlite3 module:

import timeit
setup = """
import sqlite3
conn = sqlite3.connect(':memory:')
c = conn.cursor()

c.execute('CREATE TABLE user (%s)' %\
("".join(("c%i VARCHAR(1) DEFAULT '0' NOT NULL, d%i VARCHAR(2) DEFAULT '00' NOT NULL," % (index, index) for index in xrange(99))) +\
"c99 VARCHAR(1) DEFAULT '0' NOT NULL, d99 VARCHAR(2) DEFAULT '0' NOT NULL"))

c.execute("CREATE TABLE user2 (c0 VARCHAR(100) DEFAULT '%s' NOT NULL, d0 VARCHAR(200) DEFAULT '%s' NOT NULL)" % ('0'* 100, '0'*200))
conn.commit()
"""

many_columns = """
c.executemany('INSERT INTO user VALUES (%s)' % ('?,' * 199 + '?'), [('0', '00') * 100] * 4000)
conn.commit()
"""

two_columns = """
c.executemany('INSERT INTO user2 VALUES (?,?)', [('0'*100, '0'*200)] * 4000)
conn.commit()
"""

timeit.Timer(many_columns, setup).timeit(number = 1)
timeit.Timer(two_columns, setup).timeit(number = 1)

>>> timeit.Timer(many_columns, setup).timeit(number = 1)
0.14044189453125
>>> timeit.Timer(two_columns, setup).timeit(number = 1)
0.014360189437866211
>>>

Regarding "python - Does the number of columns affect SQLAlchemy's speed?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/11388988/
