
mysql - Querying changes to MySQL records


I have a table like this:

id | status | user_id | created_at
---:--------:---------:--------------------
1 | 0 | 1 | 2014-01-05 07:23:15
2 | 1 | 1 | 2014-01-05 07:23:16
3 | 1 | 1 | 2014-01-05 07:23:17
4 | 0 | 1 | 2014-01-05 07:23:18
5 | 0 | 1 | 2014-01-05 07:23:19
6 | 1 | 1 | 2014-01-05 07:23:20
7 | 0 | 2 | 2014-01-05 07:23:21
8 | 0 | 1 | 2014-01-05 07:23:22
9 | 0 | 2 | 2014-01-05 07:23:23
10 | 1 | 2 | 2014-01-05 07:23:24
11 | 0 | 2 | 2014-01-05 07:23:25
12 | 1 | 2 | 2014-01-05 07:23:26

I want to query the changes of the status field, grouped by user_id, always getting the last status (based on created_at). The result of the query should look like this:

id | status | user_id | created_at
---:--------:---------:--------------------
1 | 0 | 1 | 2014-01-05 07:23:15
3 | 1 | 1 | 2014-01-05 07:23:17
5 | 0 | 1 | 2014-01-05 07:23:19
6 | 1 | 1 | 2014-01-05 07:23:20
8 | 0 | 1 | 2014-01-05 07:23:22
9 | 0 | 2 | 2014-01-05 07:23:23
10 | 1 | 2 | 2014-01-05 07:23:24
11 | 0 | 2 | 2014-01-05 07:23:25
12 | 1 | 2 | 2014-01-05 07:23:26

Is there a way to query these changes in SQL? How should the query be written?

Best Answer

Consider the following...

DROP TABLE IF EXISTS my_table;

CREATE TABLE my_table
(id INT NOT NULL AUTO_INCREMENT PRIMARY KEY
,status TINYINT NOT NULL DEFAULT 1
,user_id INT NOT NULL
,created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);

INSERT INTO my_table VALUES
(1 , 0 , 1 ,'2014-01-05 07:23:15'),
(2 , 1 , 1 ,'2014-01-05 07:23:16'),
(3 , 1 , 1 ,'2014-01-05 07:23:17'),
(4 , 0 , 1 ,'2014-01-05 07:23:18'),
(5 , 0 , 1 ,'2014-01-05 07:23:19'),
(6 , 1 , 1 ,'2014-01-05 07:23:20'),
(7 , 0 , 2 ,'2014-01-05 07:23:21'),
(8 , 0 , 1 ,'2014-01-05 07:23:22'),
(9 , 0 , 2 ,'2014-01-05 07:23:23'),
(10 , 1 , 2 ,'2014-01-05 07:23:24'),
(11 , 0 , 2 ,'2014-01-05 07:23:25'),
(12 , 1 , 2 ,'2014-01-05 07:23:26');

For the solution provided below, it doesn't matter whether the id values are contiguous; what matters is that they are sequential (in the same order as created_at). I've broken the solution into parts so you can see what it's doing...

The first part ranks the results per user...

SELECT x.* 
, COUNT(*) rank
FROM my_table x
JOIN my_table y
ON y.user_id = x.user_id
AND y.id <= x.id
GROUP
BY x.id
ORDER
BY x.user_id,rank;

+----+--------+---------+---------------------+------+
| id | status | user_id | created_at | rank |
+----+--------+---------+---------------------+------+
| 1 | 0 | 1 | 2014-01-05 07:23:15 | 1 |
| 2 | 1 | 1 | 2014-01-05 07:23:16 | 2 |
| 3 | 1 | 1 | 2014-01-05 07:23:17 | 3 |
| 4 | 0 | 1 | 2014-01-05 07:23:18 | 4 |
| 5 | 0 | 1 | 2014-01-05 07:23:19 | 5 |
| 6 | 1 | 1 | 2014-01-05 07:23:20 | 6 |
| 8 | 0 | 1 | 2014-01-05 07:23:22 | 7 |
| 7 | 0 | 2 | 2014-01-05 07:23:21 | 1 |
| 9 | 0 | 2 | 2014-01-05 07:23:23 | 2 |
| 10 | 1 | 2 | 2014-01-05 07:23:24 | 3 |
| 11 | 0 | 2 | 2014-01-05 07:23:25 | 4 |
| 12 | 1 | 2 | 2014-01-05 07:23:26 | 5 |
+----+--------+---------+---------------------+------+
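
As a side note (not part of the original answer): on MySQL 8.0 or later, the same per-user ranking could be produced without the self-join by using a window function. A minimal sketch against the same my_table, assuming a server that supports window functions:

-- Sketch only (assumes MySQL 8.0+): ROW_NUMBER() replaces the COUNT(*) self-join.
-- The alias rn is used because RANK is a reserved word in MySQL 8.0.
SELECT x.*
     , ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY id) AS rn
  FROM my_table x
 ORDER
    BY x.user_id, rn;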

The second part joins this query to itself and highlights the anomalies (rows whose immediate successor for the same user carries the same status)...

SELECT a.*
, b.id
FROM
( SELECT x.*
, COUNT(*) rank
FROM my_table x
JOIN my_table y
ON y.user_id = x.user_id
AND y.id <= x.id
GROUP
BY x.id
) a
LEFT
JOIN
( SELECT x.*
, COUNT(*) rank
FROM my_table x
JOIN my_table y
ON y.user_id = x.user_id
AND y.id <= x.id
GROUP
BY x.id
) b
ON b.user_id = a.user_id
AND b.status = a.status
AND b.rank = a.rank + 1;


+----+--------+---------+---------------------+------+------+
| id | status | user_id | created_at | rank | id |
+----+--------+---------+---------------------+------+------+
| 1 | 0 | 1 | 2014-01-05 07:23:15 | 1 | NULL |
| 2 | 1 | 1 | 2014-01-05 07:23:16 | 2 | 3 |
| 3 | 1 | 1 | 2014-01-05 07:23:17 | 3 | NULL |
| 4 | 0 | 1 | 2014-01-05 07:23:18 | 4 | 5 |
| 5 | 0 | 1 | 2014-01-05 07:23:19 | 5 | NULL |
| 6 | 1 | 1 | 2014-01-05 07:23:20 | 6 | NULL |
| 7 | 0 | 2 | 2014-01-05 07:23:21 | 1 | 9 |
| 8 | 0 | 1 | 2014-01-05 07:23:22 | 7 | NULL |
| 9 | 0 | 2 | 2014-01-05 07:23:23 | 2 | NULL |
| 10 | 1 | 2 | 2014-01-05 07:23:24 | 3 | NULL |
| 11 | 0 | 2 | 2014-01-05 07:23:25 | 4 | NULL |
| 12 | 1 | 2 | 2014-01-05 07:23:26 | 5 | NULL |
+----+--------+---------+---------------------+------+------+

The third and final part is deliberately left as an exercise for the reader. One drawback of this solution, however, is that it doesn't scale particularly well.
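
For completeness, a minimal sketch of that final step (my addition, not spelled out in the answer): keep only the rows for which the self-join found no same-status successor, i.e. where b.id IS NULL. The rank alias is backtick-quoted here because RANK is a reserved word on MySQL 8.0+.

SELECT a.id
     , a.status
     , a.user_id
     , a.created_at
  FROM
     ( SELECT x.*
            , COUNT(*) `rank`
         FROM my_table x
         JOIN my_table y
           ON y.user_id = x.user_id
          AND y.id <= x.id
        GROUP
           BY x.id
     ) a
  LEFT
  JOIN
     ( SELECT x.*
            , COUNT(*) `rank`
         FROM my_table x
         JOIN my_table y
           ON y.user_id = x.user_id
          AND y.id <= x.id
        GROUP
           BY x.id
     ) b
    ON b.user_id = a.user_id
   AND b.status = a.status
   AND b.`rank` = a.`rank` + 1
 WHERE b.id IS NULL
 ORDER
    BY a.id;

As for the scaling caveat, on MySQL 8.0+ the whole thing could be written with the LEAD() window function, which avoids the quadratic self-join entirely (again just a sketch, not part of the original answer):

-- Keep a row only if the next row for the same user has a different status
-- (or there is no next row). Assumes MySQL 8.0+.
SELECT id, status, user_id, created_at
  FROM
     ( SELECT t.*
            , LEAD(status) OVER (PARTITION BY user_id ORDER BY id) AS next_status
         FROM my_table t
     ) s
 WHERE next_status IS NULL
    OR next_status <> status
 ORDER
    BY id;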

Regarding querying changes to MySQL records, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/21454817/
