
php - MYSQL get all grouped results in one query, with the row count


So I have searched all over for this, but I couldn't find this particular relationship. The query below returns all of a user's posts together with the number of people who liked each post; because I use GROUP BY postid, duplicate result rows that share the same postid are collapsed into a single group.

SELECT posts.id postid, posts.post_body, posts.post_type, ALLUSERS.USERNAME,
       likes.liker, likes.target,
       plikers.*,
       COUNT(posts.id) numberOflikes
FROM posts
INNER JOIN ALLUSERS ON (ALLUSERS.USERID = posts.FROM_userid)
LEFT JOIN likes ON (likes.target = posts.id)
LEFT JOIN (SELECT USERID pl_id FROM ALLUSERS) plikers ON (pl_id = likes.liker)
GROUP BY postid

The result is...

+--------+-----------------+------------------------+-----------+-------+--------+-------+-----------+---------------+
| postid | post_body       | post_type              | USERNAME  | liker | target | pl_id | pl_un     | numberOflikes |
+--------+-----------------+------------------------+-----------+-------+--------+-------+-----------+---------------+
|     83 | Southgate       | 20&&03 Saturday/04:05  | Superuser |  NULL |   NULL |  NULL | NULL      |             1 |
|     84 | Great post!     | 10&&03 Saturday/04:07  | Superuser |     4 |     84 |     4 | dennisrec |             7 |
|     85 | How delightful? | 10&&03 Saturday/04:07  | Superuser |    43 |     85 |    43 | zerCon    |             1 |
|     87 | Cheers...       | 10&&07 Wednesday/01:53 | Superuser |  NULL |   NULL |  NULL | NULL      |             1 |
|     88 | check this out! | 20&&07 Wednesday/03:31 | Superuser |  NULL |   NULL |  NULL | NULL      |             1 |
+--------+-----------------+------------------------+-----------+-------+--------+-------+-----------+---------------+

This is correct, but it only returns the first row of each group. So the question is: is there a way to return all rows of every group in a single query?

Obviously I could drop the GROUP BY and COUNT(*) clauses, fetch the many duplicate rows, and then filter them myself to get the full details of each post's likers, but that would certainly slow my server down, so I have already ruled that out.
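
For illustration, the ungrouped variant described here is roughly the following sketch (the same joins with the GROUP BY and COUNT removed), which returns one row per post/liker pair:

SELECT posts.id postid, posts.post_body, posts.post_type, ALLUSERS.USERNAME,
       likes.liker, likes.target,
       plikers.*
FROM posts
INNER JOIN ALLUSERS ON (ALLUSERS.USERID = posts.FROM_userid)
LEFT JOIN likes ON (likes.target = posts.id)
LEFT JOIN (SELECT USERID pl_id FROM ALLUSERS) plikers ON (pl_id = likes.liker);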

Any help would be greatly appreciated.

Best Answer

If your model looks like this

MariaDB [sandbox]> select * from posts;
+------+-----------+-----------+-------------+
| id   | post_body | post_type | from_userid |
+------+-----------+-----------+-------------+
|    1 | POST1     | NULL      |           1 |
|    2 | POST2     | NULL      |           2 |
+------+-----------+-----------+-------------+
2 rows in set (0.00 sec)

MariaDB [sandbox]> select * from likes;
+------+--------+-------+
| id   | TARGET | liker |
+------+--------+-------+
|    1 |      1 |     3 |
|    2 |      1 |     7 |
|    3 |      2 |     8 |
|    3 |      2 |     6 |
+------+--------+-------+
4 rows in set (0.00 sec)

MariaDB [sandbox]> select * from users where id < 9;
+----+----------+-----------+--------+---------------------+
| id | userName | photo     | status | ts                  |
+----+----------+-----------+--------+---------------------+
|  1 | John     | john.png  |      1 | 2016-12-08 13:14:24 |
|  2 | Jane     | jane.png  |      1 | 2016-12-08 13:14:24 |
|  3 | Ali      |           |      1 | 2016-12-08 13:14:24 |
|  6 | Bruce    | bruce.png |      1 | 2016-12-08 13:14:24 |
|  7 | Martha   |           |      1 | 2016-12-08 13:14:24 |
|  8 | Sidney   |           |      1 | 2016-12-08 13:14:24 |
+----+----------+-----------+--------+---------------------+
6 rows in set (0.00 sec)
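
For anyone who wants to reproduce this sandbox, here is a minimal schema sketch; the column types are assumptions inferred from the sample rows above, not the original definitions:

-- Assumed schema, inferred from the sample data shown above; types are guesses.
CREATE TABLE posts (
  id INT,
  post_body VARCHAR(255),
  post_type VARCHAR(64),
  from_userid INT            -- references users.id
);

CREATE TABLE likes (
  id INT,
  target INT,                -- the posts.id being liked
  liker INT                  -- the users.id of the person who liked it
);

CREATE TABLE users (
  id INT,
  userName VARCHAR(64),
  photo VARCHAR(255),
  status INT,
  ts TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);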

Then, as @1000111 suggested:

MariaDB [sandbox]> SELECT  posts.id postid,posts.post_body,posts.post_type,POSTS.FROM_USERID
-> , USERS.USERNAME
-> ,GROUP_CONCAT(likes.liker) LIKER
-> ,likes.target
-> ,GROUP_CONCAT(plikers.pl_id) pl_id
-> ,GROUP_CONCAT(plikers.UNAME) pl_un
-> ,COUNT(posts.id) numberOflikes
->
-> FROM posts
-> INNER JOIN USERS ON USERS.ID=posts.FROM_userid
-> LEFT JOIN likes ON likes.target=posts.id
-> LEFT JOIN(SELECT ID pl_id, USERNAME UNAME FROM USERS )plikers ON pl_id=likes.liker
-> GROUP BY postid;
+--------+-----------+-----------+-------------+----------+-------+--------+-------+--------------+---------------+
| postid | post_body | post_type | FROM_USERID | USERNAME | LIKER | target | pl_id | pl_un        | numberOflikes |
+--------+-----------+-----------+-------------+----------+-------+--------+-------+--------------+---------------+
|      1 | POST1     | NULL      |           1 | John     | 7,3   |      1 | 7,3   | Martha,Ali   |             2 |
|      2 | POST2     | NULL      |           2 | Jane     | 6,8   |      2 | 6,8   | Bruce,Sidney |             2 |
+--------+-----------+-----------+-------------+----------+-------+--------+-------+--------------+---------------+
2 rows in set (0.00 sec)

But you should take note of the caveats.
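
That caveat most likely refers to GROUP_CONCAT truncation: the concatenated list is cut off at group_concat_max_len (1024 bytes by default), and the server only reports it as a warning. A quick way to check and raise the limit for the current session:

-- Check the current limit (the default is 1024 bytes)
SHOW VARIABLES LIKE 'group_concat_max_len';

-- Raise it for this session if the like lists can grow long
SET SESSION group_concat_max_len = 1000000;

-- After running the grouped query, any truncation shows up here
SHOW WARNINGS;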

A similar question about getting all grouped results in one query with row counts can be found on Stack Overflow: https://stackoverflow.com/questions/41094391/
