This post shows how to use GROUP BY in MySQL to fetch, for each group, the row that holds a column's maximum value. Hopefully you'll find it useful; let's dive in.
Suppose we have a business scenario where we need to query users' login records, with a table structured as follows:
CREATE TABLE `tb` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`uid` int(11) NOT NULL,
`ip` varchar(16) NOT NULL,
`login_time` datetime,
PRIMARY KEY (`id`),
KEY (`uid`)
);
Now add some test data:
INSERT INTO tb SELECT null, 1001, '192.168.1.1', '2016-01-01 16:30:47';
INSERT INTO tb SELECT null, 1003, '192.168.1.153', '2016-01-01 19:30:51';
INSERT INTO tb SELECT null, 1001, '192.168.1.61', '2016-01-01 16:50:41';
INSERT INTO tb SELECT null, 1002, '192.168.1.31', '2016-01-01 18:30:21';
INSERT INTO tb SELECT null, 1002, '192.168.1.66', '2016-01-01 19:12:32';
INSERT INTO tb SELECT null, 1001, '192.168.1.81', '2016-01-01 19:53:09';
INSERT INTO tb SELECT null, 1001, '192.168.1.231', '2016-01-01 19:55:34';
The table contents:
+----+------+---------------+---------------------+
| id | uid | ip | login_time |
+----+------+---------------+---------------------+
| 1 | 1001 | 192.168.1.1 | 2016-01-01 16:30:47 |
| 2 | 1003 | 192.168.1.153 | 2016-01-01 19:30:51 |
| 3 | 1001 | 192.168.1.61 | 2016-01-01 16:50:41 |
| 4 | 1002 | 192.168.1.31 | 2016-01-01 18:30:21 |
| 5 | 1002 | 192.168.1.66 | 2016-01-01 19:12:32 |
| 6 | 1001 | 192.168.1.81 | 2016-01-01 19:53:09 |
| 7 | 1001 | 192.168.1.231 | 2016-01-01 19:55:34 |
+----+------+---------------+---------------------+
If we only need each user's last login time, the query is simple:
SELECT uid, max(login_time)
FROM tb
GROUP BY uid;
+------+---------------------+
| uid | max(login_time) |
+------+---------------------+
| 1001 | 2016-01-01 19:55:34 |
| 1002 | 2016-01-01 19:12:32 |
| 1003 | 2016-01-01 19:30:51 |
+------+---------------------+
But if we also want other columns from the row of each user's last login, this kind of SQL won't do:
-- Wrong:
SELECT uid, ip, max(login_time)
FROM tb
GROUP BY uid;
This statement is not standard SQL. MySQL may execute it without complaint, but what it returns is indeterminate (and if sql_mode includes ONLY_FULL_GROUP_BY, it fails outright). The non-aggregated ip column is taken from an arbitrary row within each uid group, often simply the first row encountered, which is clearly not the information we want.
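To check whether that strict mode is active on your server, you can inspect the session's sql_mode (a quick check of my own, not part of the original test setup; note that MySQL 5.7 and later enable ONLY_FULL_GROUP_BY by default):
-- Show the active SQL modes for the current session;
-- look for ONLY_FULL_GROUP_BY in the returned string.
SELECT @@SESSION.sql_mode;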
Approach ①
One option is a subquery. Note that filtering on login_time alone would be subtly wrong: a user's ordinary login could coincide with another user's maximum timestamp and sneak into the result. Match on the (uid, login_time) pair instead:
SELECT a.uid, a.ip, a.login_time
FROM tb a
WHERE (a.uid, a.login_time) IN (
        SELECT uid, max(login_time)
        FROM tb
        GROUP BY uid);
Approach ②
Alternatively, use a correlated subquery; here the inner query is restricted to the current row's uid, so the pitfall above does not arise:
SELECT a.uid, a.ip, a.login_time
FROM tb a
WHERE a.login_time = (
SELECT max(login_time)
FROM tb
WHERE a.uid = uid);
A side test of approach ②: in versions before 5.6, this SQL produces a poor execution plan on large tables, because the dependent subquery performs a full table scan for every row of the outer query. In 5.6 and later the same statement is much faster, as the plan changes to a single ref lookup on the uid index.
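The plans were captured by prefixing the query with EXPLAIN (the test schema is assumed to be named test, hence the test.a.uid in the ref column):
EXPLAIN SELECT a.uid, a.ip, a.login_time
FROM tb a
WHERE a.login_time = (
    SELECT max(login_time)
    FROM tb
    WHERE a.uid = uid);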
5.5.50:
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
| id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra |
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
| 1 | PRIMARY | a | ALL | NULL | NULL | NULL | NULL | 7 | Using where |
| 2 | DEPENDENT SUBQUERY | tb | ALL | uid | NULL | NULL | NULL | 7 | Using where |
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
5.6.30:
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
| id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra |
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
| 1 | PRIMARY | a | ALL | NULL | NULL | NULL | NULL | 7 | Using where |
| 2 | DEPENDENT SUBQUERY | tb | ref | uid | uid | 4 | test.a.uid | 1 | NULL |
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
Approach ③
Or simply rewrite it as a join, which performs even better:
SELECT a.uid, a.ip, a.login_time
FROM (SELECT uid, max(login_time) login_time
FROM tb
GROUP BY uid
) b JOIN tb a ON a.uid = b.uid AND a.login_time = b.login_time;
Naturally, every approach returns the same result:
+------+---------------+---------------------+
| uid | ip | login_time |
+------+---------------+---------------------+
| 1003 | 192.168.1.153 | 2016-01-01 19:30:51 |
| 1002 | 192.168.1.66 | 2016-01-01 19:12:32 |
| 1001 | 192.168.1.231 | 2016-01-01 19:55:34 |
+------+---------------+---------------------+
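One further tweak worth trying (my own suggestion, not benchmarked in the article): a composite index on (uid, login_time) lets the inner GROUP BY resolve max(login_time) straight from the index rather than by reading rows:
-- Hypothetical index name; it covers both the grouping column and the aggregated column.
ALTER TABLE tb ADD KEY idx_uid_login (uid, login_time);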
And of course, to take each group's minimum instead, just swap in the corresponding function, as in the sketch below.
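A minimal variant of approach ③ that returns each user's earliest login (same schema as above, with max swapped for min):
SELECT a.uid, a.ip, a.login_time
FROM (SELECT uid, min(login_time) login_time
      FROM tb
      GROUP BY uid
     ) b JOIN tb a ON a.uid = b.uid AND a.login_time = b.login_time;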
That's all for "how to use GROUP BY in MySQL to take a column's maximum per group". Thanks for reading, and I hope you found it helpful.