Quickly Deploy a TiDB Hands-On Environment with TiUP (Installing a TiDB Cluster on a Single Host Using TiUP Playground)
Tags: NewSQL, TiDB, TiUP, TiUP Playground, installation and deployment
Reference: https://docs.pingcap.com/zh/tidb/stable/quick-start-with-tidb#Linux
See also:
【DB宝54】An Introduction to TiDB, a NewSQL Database: https://www.xmmup.com/dbbao54newsqlshujukuzhitidbjianjie.html
【DB宝57】Quickly Deploying a TiDB Cluster Environment with Docker-Compose: https://www.xmmup.com/dbbao57shiyongdocker-composekuaisubushutidbjiqunhuanjing.html
TiDB is a distributed database. The most basic TiDB test cluster usually consists of 2 TiDB instances, 3 TiKV instances, 3 PD instances, and an optional TiFlash instance. With TiUP Playground you can quickly stand up such a basic test cluster.
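Once a playground is up, a quick way to confirm it really has this topology is to count the component processes. A minimal sketch, assuming a Linux host with procps `pgrep` available and a cluster started with the instance counts above:

```bash
# Count running TiDB component processes. For the topology above you would
# expect 2 tidb-server, 3 tikv-server, 3 pd-server, and 1 tiflash process.
for c in tidb-server tikv-server pd-server tiflash; do
  printf '%-12s %s\n' "$c" "$(pgrep -fc "$c")"
done
```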
Note:
- For a playground started this way, TiUP cleans up the original cluster data once the deployment test ends; rerunning the command gives you a brand-new cluster.
- If you want the data to persist, start TiUP with the `--tag` flag: `tiup --tag <your-tag> playground ...` (a sketch follows below). See the TiUP reference manual for details.
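For example, a sketch of how `--tag` makes data survive restarts (the tag name `demo` is illustrative; the data directory follows the `$TIUP_HOME/data/$tag` convention shown in the `tiup -h` output later in this post):

```bash
# Start a playground under a fixed tag; its data lands in /root/.tiup/data/demo
tiup --tag demo playground v5.3.0 --db 1 --pd 1 --kv 1 &

# ... use the cluster, then stop it (Ctrl+C, or kill the tiup process) ...

# Restarting with the same tag reuses the same data directory,
# so previously created tables are still there:
tiup --tag demo playground v5.3.0 --db 1 --pd 1 --kv 1 &

# The data persists until you explicitly clean the tagged instance:
tiup clean demo
```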
```
docker rm -f lhrtidb
docker run -d --name lhrtidb -h lhrtidb \
  -p 44000-44001:4000-4001 -p 42379-42385:2379-2385 -p 49090:9090 -p 43000:3000 -p 43389:3389 \
  -v /sys/fs/cgroup:/sys/fs/cgroup \
  --privileged=true lhrbest/lhrcentos76:8.5 \
  /usr/sbin/init
docker exec -it lhrtidb bash

-- Install TiUP
curl --proto '=https' --tlsv1.2 -sSf https://tiup-mirrors.pingcap.com/install.sh | sh
source /root/.bash_profile
echo "export PATH=/root/.tiup/bin:$PATH" >> /root/.bashrc

-- Start the latest version of the TiDB cluster, with 1 instance each of TiDB, TiKV, PD, and TiFlash
tiup playground

-- Or specify a TiDB version and the number of instances of each component:
tiup playground v5.2.1 --db 2 --pd 3 --kv 3 --host=0.0.0.0 &
tiup playground v5.3.0 --db 2 --pd 3 --kv 3 --host=0.0.0.0 &

-- Pass configuration parameters
echo 'oom-action = "log"' > /tmp/tidb_config.toml
tiup playground --tag lhrtidb v5.3.0 --db 2 --pd 3 --kv 3 --tiflash 1 --host=0.0.0.0 --db.config /tmp/tidb_config.toml &

-- Check status
tiup status

-- List installed components
tiup list --installed

-- Connect to TiDB via TiUP:
tiup client

-- Or connect with a MySQL client
yum install -y mariadb mariadb-libs mariadb-devel
mysql --host 172.17.0.15 --port 4000 -u root
mysql -uroot -P 44000 -h192.168.66.35

select tidb_version();
select version();
select STORE_ID,ADDRESS,STORE_STATE,STORE_STATE_NAME,CAPACITY,AVAILABLE,UPTIME from INFORMATION_SCHEMA.TIKV_STORE_STATUS;
show config where name like '%oom-action%';
select * from INFORMATION_SCHEMA.cluster_info order by type,instance;

-- Clean up the TiDB cluster
tiup clean --all

[root@lhrtidb /]# ps -ef|grep tiup
root 6622 6444 0 14:24 pts/6 00:00:00 tiup playground --tag lhrtidb v5.3.0 --db 2 --pd 3 --kv 3 --host=0.0.0.0 --db.config /tmp/tidb_config.toml
root 6634 6622 0 14:24 pts/6 00:00:01 /root/.tiup/components/playground/v1.8.1/tiup-playground --tag lhrtidb v5.3.0 --db 2 --pd 3 --kv 3 --host=0.0.0.0 --db.config /tmp/tidb_config.toml
root 6651 6634 11 14:24 pts/6 00:01:13 /root/.tiup/components/pd/v5.3.0/pd-server --name=pd-0 --data-dir=/root/.tiup/data/lhrtidb/pd-0/data --peer-urls=http://172.17.0.4:2380 --advertise-peer-urls=http://172.17.0.4:2380 --client-urls=http://172.17.0.4:2379 --advertise-client-urls=http://172.17.0.4:2379 --log-file=/root/.tiup/data/lhrtidb/pd-0/pd.log --initial-cluster=pd-0=http://172.17.0.4:2380,pd-1=http://172.17.0.4:2381,pd-2=http://172.17.0.4:2383
root 6663 6634 3 14:24 pts/6 00:00:20 /root/.tiup/components/pd/v5.3.0/pd-server --name=pd-1 --data-dir=/root/.tiup/data/lhrtidb/pd-1/data --peer-urls=http://172.17.0.4:2381 --advertise-peer-urls=http://172.17.0.4:2381 --client-urls=http://172.17.0.4:2382 --advertise-client-urls=http://172.17.0.4:2382 --log-file=/root/.tiup/data/lhrtidb/pd-1/pd.log --initial-cluster=pd-0=http://172.17.0.4:2380,pd-1=http://172.17.0.4:2381,pd-2=http://172.17.0.4:2383
root 6677 6634 3 14:24 pts/6 00:00:20 /root/.tiup/components/pd/v5.3.0/pd-server --name=pd-2 --data-dir=/root/.tiup/data/lhrtidb/pd-2/data --peer-urls=http://172.17.0.4:2383 --advertise-peer-urls=http://172.17.0.4:2383 --client-urls=http://172.17.0.4:2384 --advertise-client-urls=http://172.17.0.4:2384 --log-file=/root/.tiup/data/lhrtidb/pd-2/pd.log --initial-cluster=pd-0=http://172.17.0.4:2380,pd-1=http://172.17.0.4:2381,pd-2=http://172.17.0.4:2383
root 6692 6634 2 14:24 pts/6 00:00:19 /root/.tiup/components/tikv/v5.3.0/tikv-server --addr=172.17.0.4:20160 --advertise-addr=172.17.0.4:20160 --status-addr=172.17.0.4:20180 --pd=http://172.17.0.4:2379,http://172.17.0.4:2382,http://172.17.0.4:2384 --config=/root/.tiup/data/lhrtidb/tikv-0/tikv.toml --data-dir=/root/.tiup/data/lhrtidb/tikv-0/data --log-file=/root/.tiup/data/lhrtidb/tikv-0/tikv.log
root 6701 6634 2 14:24 pts/6 00:00:18 /root/.tiup/components/tikv/v5.3.0/tikv-server --addr=172.17.0.4:20161 --advertise-addr=172.17.0.4:20161 --status-addr=172.17.0.4:20181 --pd=http://172.17.0.4:2379,http://172.17.0.4:2382,http://172.17.0.4:2384 --config=/root/.tiup/data/lhrtidb/tikv-1/tikv.toml --data-dir=/root/.tiup/data/lhrtidb/tikv-1/data --log-file=/root/.tiup/data/lhrtidb/tikv-1/tikv.log
root 6707 6634 2 14:24 pts/6 00:00:19 /root/.tiup/components/tikv/v5.3.0/tikv-server --addr=172.17.0.4:20162 --advertise-addr=172.17.0.4:20162 --status-addr=172.17.0.4:20182 --pd=http://172.17.0.4:2379,http://172.17.0.4:2382,http://172.17.0.4:2384 --config=/root/.tiup/data/lhrtidb/tikv-2/tikv.toml --data-dir=/root/.tiup/data/lhrtidb/tikv-2/data --log-file=/root/.tiup/data/lhrtidb/tikv-2/tikv.log
root 6710 6634 2 14:24 pts/6 00:00:17 /root/.tiup/components/tidb/v5.3.0/tidb-server -P 4000 --store=tikv --host=172.17.0.4 --status=10080 --path=172.17.0.4:2379,172.17.0.4:2382,172.17.0.4:2384 --log-file=/root/.tiup/data/lhrtidb/tidb-0/tidb.log --config=/tmp/tidb_config.toml
root 6719 6634 3 14:24 pts/6 00:00:21 /root/.tiup/components/tidb/v5.3.0/tidb-server -P 4001 --store=tikv --host=172.17.0.4 --status=10081 --path=172.17.0.4:2379,172.17.0.4:2382,172.17.0.4:2384 --log-file=/root/.tiup/data/lhrtidb/tidb-1/tidb.log --config=/tmp/tidb_config.toml
root 7242 6634 3 14:25 pts/6 00:00:24 /root/.tiup/components/prometheus/v5.3.0/prometheus/prometheus --config.file=/root/.tiup/data/lhrtidb/prometheus/prometheus.yml --web.external-url=http://0.0.0.0:9090 --web.listen-address=0.0.0.0:9090 --storage.tsdb.path=/root/.tiup/data/lhrtidb/prometheus/data
root 7243 6634 1 14:25 pts/6 00:00:12 /root/.tiup/components/prometheus/v5.3.0/ng-monitoring-server --pd.endpoints=172.17.0.4:2379,172.17.0.4:2382,172.17.0.4:2384 --address=0.0.0.0:12020 --advertise-address=0.0.0.0:12020 --storage.path=/root/.tiup/data/lhrtidb/prometheus/data --log.path=/root/.tiup/data/lhrtidb/prometheus/logs
root 7288 6634 4 14:25 pts/6 00:00:29 /root/.tiup/components/grafana/v5.3.0/bin/grafana-server --homepath /root/.tiup/data/lhrtidb/grafana --config /root/.tiup/data/lhrtidb/grafana/conf/custom.ini cfg:default.paths.logs=/root/.tiup/data/lhrtidb/grafana/log
root 7308 6634 11 14:25 pts/6 00:01:11 /root/.tiup/components/tiflash/v5.3.0/tiflash/tiflash server --config-file=/root/.tiup/data/lhrtidb/tiflash-0/tiflash.toml
```
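With the playground running inside the container, a quick smoke test can go through the Docker port mappings from the host. A sketch, assuming the MariaDB client installed above and the host IP/port mapping from the `docker run` command (192.168.66.35 and 44000→4000; adjust for your environment):

```bash
# Connect through the mapped port (host 44000 -> container 4000) and verify
# the server version, TiKV/TiFlash store status, and the oom-action setting.
mysql -uroot -h192.168.66.35 -P44000 --comments -e "
  select tidb_version()\G
  select STORE_ID, ADDRESS, STORE_STATE_NAME
  from INFORMATION_SCHEMA.TIKV_STORE_STATUS;
  show config where name like '%oom-action%';
"
```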
Walkthrough:
```
[root@docker35 ~]# docker rm -f lhrtidb
docker run -d --name lhrtidb -h lhrtidb \
-p 44000-44001:4000-4001 -p 42379:2379 -p 49090:9090 -p 43000:3000 -p 43389:3389 \
-v /sys/fs/cgroup:/sys/fs/cgroup \
--privileged=true lhrbest/lhrcentos76:8.2 \
/usr/sbin/init
docker exec -it lhrtidb bash
lhrtidb
[root@docker35 ~]# docker run -d --name lhrtidb -h lhrtidb \
> -p 44000-44001:4000-4001 -p 42379:2379 -p 49090:9090 -p 43000:3000 -p 43389:3389 \
> -v /sys/fs/cgroup:/sys/fs/cgroup \
> --privileged=true lhrbest/lhrcentos76:8.2 \
> /usr/sbin/init
240906cdbd1e5c6fd0b04e4648eaadc96fb09e7a1216ed24981fcf35e60b49f1
[root@docker35 ~]#
[root@docker35 ~]# docker exec -it lhrtidb bash

[root@lhrtidb /]# curl --proto '=https' --tlsv1.2 -sSf https://tiup-mirrors.pingcap.com/install.sh | sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 6064k  100 6064k    0     0  4030k      0  0:00:01  0:00:01 --:--:-- 4032k
WARN: adding root certificate via internet: https://tiup-mirrors.pingcap.com/root.json
You can revoke this by remove /root/.tiup/bin/7b8e153f2e2d0928.root.json
Successfully set mirror to https://tiup-mirrors.pingcap.com
Detected shell: bash
Shell profile:  /root/.bash_profile
/root/.bash_profile has been modified to add tiup to PATH
open a new terminal or source /root/.bash_profile to use it
Installed path: /root/.tiup/bin/tiup
===============================================
Have a try:     tiup playground
===============================================
[root@lhrtidb /]# source /root/.bash_profile
[root@lhrtidb /]# tiup playground v5.2.1 --db 2 --pd 3 --kv 3 --monitor --host=0.0.0.0 &
[1] 369
[root@lhrtidb /]# The component `playground` version is not installed; downloading from repository.
download https://tiup-mirrors.pingcap.com/playground-v1.6.0-linux-amd64.tar.gz 6.46 MiB / 6.46 MiB 100.00% 6.58 MiB/s
Starting component `playground`: /root/.tiup/components/playground/v1.6.0/tiup-playground v5.2.1 --db 2 --pd 3 --kv 3 --monitor --host=0.0.0.0
Flag --monitor has been deprecated, Please use --without-monitor to control whether to disable monitor.
Playground Bootstrapping...
The component `prometheus` version v5.2.1 is not installed; downloading from repository.
download https://tiup-mirrors.pingcap.com/prometheus-v5.2.1-linux-amd64.tar.gz 39.84 MiB / 39.84 MiB 100.00% 2.54 MiB/s
download https://tiup-mirrors.pingcap.com/grafana-v5.2.1-linux-amd64.tar.gz 49.99 MiB / 49.99 MiB 100.00% 2.52 MiB/s
Start pd instance
The component `pd` version v5.2.1 is not installed; downloading from repository.
download https://tiup-mirrors.pingcap.com/pd-v5.2.1-linux-amd64.tar.gz 40.09 MiB / 40.09 MiB 100.00% 2.57 MiB/s
Start pd instance
Start pd instance
Start tikv instance
The component `tikv` version v5.2.1 is not installed; downloading from repository.
download https://tiup-mirrors.pingcap.com/tikv-v5.2.1-linux-amd64.tar.gz 164.63 MiB / 164.63 MiB 100.00% 2.37 MiB/s
Start tikv instance
Start tikv instance
Start tidb instance
The component `tidb` version v5.2.1 is not installed; downloading from repository.
download https://tiup-mirrors.pingcap.com/tidb-v5.2.1-linux-amd64.tar.gz 45.54 MiB / 45.54 MiB 100.00% 2.54 MiB/s
Start tidb instance
Waiting for tidb instances ready
172.17.0.22:4000 ... Done
172.17.0.22:4001 ... Done
Start tiflash instance
The component `tiflash` version v5.2.1 is not installed; downloading from repository.
download https://tiup-mirrors.pingcap.com/tiflash-v5.2.1-linux-amd64.tar.gz 401.39 MiB / 401.39 MiB 100.00% 2.33 MiB/s
Waiting for tiflash instances ready
172.17.0.22:3930 ... Done
CLUSTER START SUCCESSFULLY, Enjoy it ^-^
To connect TiDB: mysql --comments --host 172.17.0.22 --port 4001 -u root -p (no password)
To connect TiDB: mysql --comments --host 172.17.0.22 --port 4000 -u root -p (no password)
To view the dashboard: http://172.17.0.22:2379/dashboard
PD client endpoints: [172.17.0.22:2379 172.17.0.22:2382 172.17.0.22:2384]
To view the Prometheus: http://172.17.0.22:9090
To view the Grafana: http://172.17.0.22:3000

[root@lhrtidb /]#
[root@lhrtidb /]# ps -ef|grep tiup
root 369 69 0 08:49 pts/0 00:00:02 tiup playground v5.2.1 --db 2 --pd 3 --kv 3 --monitor --host=0.0.0.0
root 385 369 10 08:49 pts/0 00:02:31 /root/.tiup/components/playground/v1.6.0/tiup-playground v5.2.1 --db 2 --pd 3 --kv 3 --monitor --host=0.0.0.0
root 418 385 2 08:50 pts/0 00:00:40 /root/.tiup/components/prometheus/v5.2.1/prometheus/prometheus --config.file=/root/.tiup/data/Sm3RCrS/prometheus/prometheus.yml --web.external-url=http://0.0.0.0:9090 --web.listen-address=0.0.0.0:9090 --storage.tsdb.path=/root/.tiup/data/Sm3RCrS/prometheus/data
root 458 385 3 08:50 pts/0 00:00:51 /root/.tiup/components/grafana/v5.2.1/bin/grafana-server --homepath /root/.tiup/data/Sm3RCrS/grafana --config /root/.tiup/data/Sm3RCrS/grafana/conf/custom.ini cfg:default.paths.logs=/root/.tiup/data/Sm3RCrS/grafana/log
root 488 385 7 08:50 pts/0 00:01:45 /root/.tiup/components/pd/v5.2.1/pd-server --name=pd-0 --data-dir=/root/.tiup/data/Sm3RCrS/pd-0/data --peer-urls=http://172.17.0.22:2380 --advertise-peer-urls=http://172.17.0.22:2380 --client-urls=http://172.17.0.22:2379 --advertise-client-urls=http://172.17.0.22:2379 --log-file=/root/.tiup/data/Sm3RCrS/pd-0/pd.log --initial-cluster=pd-0=http://172.17.0.22:2380,pd-1=http://172.17.0.22:2381,pd-2=http://172.17.0.22:2383
root 494 385 2 08:50 pts/0 00:00:35 /root/.tiup/components/pd/v5.2.1/pd-server --name=pd-1 --data-dir=/root/.tiup/data/Sm3RCrS/pd-1/data --peer-urls=http://172.17.0.22:2381 --advertise-peer-urls=http://172.17.0.22:2381 --client-urls=http://172.17.0.22:2382 --advertise-client-urls=http://172.17.0.22:2382 --log-file=/root/.tiup/data/Sm3RCrS/pd-1/pd.log --initial-cluster=pd-0=http://172.17.0.22:2380,pd-1=http://172.17.0.22:2381,pd-2=http://172.17.0.22:2383
root 500 385 2 08:50 pts/0 00:00:32 /root/.tiup/components/pd/v5.2.1/pd-server --name=pd-2 --data-dir=/root/.tiup/data/Sm3RCrS/pd-2/data --peer-urls=http://172.17.0.22:2383 --advertise-peer-urls=http://172.17.0.22:2383 --client-urls=http://172.17.0.22:2384 --advertise-client-urls=http://172.17.0.22:2384 --log-file=/root/.tiup/data/Sm3RCrS/pd-2/pd.log --initial-cluster=pd-0=http://172.17.0.22:2380,pd-1=http://172.17.0.22:2381,pd-2=http://172.17.0.22:2383
root 599 385 2 08:52 pts/0 00:00:28 /root/.tiup/components/tikv/v5.2.1/tikv-server --addr=172.17.0.22:20160 --advertise-addr=172.17.0.22:20160 --status-addr=172.17.0.22:20180 --pd=http://172.17.0.22:2379,http://172.17.0.22:2382,http://172.17.0.22:2384 --config=/root/.tiup/data/Sm3RCrS/tikv-0/tikv.toml --data-dir=/root/.tiup/data/Sm3RCrS/tikv-0/data --log-file=/root/.tiup/data/Sm3RCrS/tikv-0/tikv.log
root 600 385 1 08:52 pts/0 00:00:26 /root/.tiup/components/tikv/v5.2.1/tikv-server --addr=172.17.0.22:20161 --advertise-addr=172.17.0.22:20161 --status-addr=172.17.0.22:20181 --pd=http://172.17.0.22:2379,http://172.17.0.22:2382,http://172.17.0.22:2384 --config=/root/.tiup/data/Sm3RCrS/tikv-1/tikv.toml --data-dir=/root/.tiup/data/Sm3RCrS/tikv-1/data --log-file=/root/.tiup/data/Sm3RCrS/tikv-1/tikv.log
root 601 385 1 08:52 pts/0 00:00:26 /root/.tiup/components/tikv/v5.2.1/tikv-server --addr=172.17.0.22:20162 --advertise-addr=172.17.0.22:20162 --status-addr=172.17.0.22:20182 --pd=http://172.17.0.22:2379,http://172.17.0.22:2382,http://172.17.0.22:2384 --config=/root/.tiup/data/Sm3RCrS/tikv-2/tikv.toml --data-dir=/root/.tiup/data/Sm3RCrS/tikv-2/data --log-file=/root/.tiup/data/Sm3RCrS/tikv-2/tikv.log
root 991 385 2 08:52 pts/0 00:00:26 /root/.tiup/components/tidb/v5.2.1/tidb-server -P 4000 --store=tikv --host=172.17.0.22 --status=10080 --path=172.17.0.22:2379,172.17.0.22:2382,172.17.0.22:2384 --log-file=/root/.tiup/data/Sm3RCrS/tidb-0/tidb.log
root 997 385 2 08:52 pts/0 00:00:32 /root/.tiup/components/tidb/v5.2.1/tidb-server -P 4001 --store=tikv --host=172.17.0.22 --status=10081 --path=172.17.0.22:2379,172.17.0.22:2382,172.17.0.22:2384 --log-file=/root/.tiup/data/Sm3RCrS/tidb-1/tidb.log
root 1205 385 11 08:55 pts/0 00:02:10 /root/.tiup/components/tiflash/v5.2.1/tiflash/tiflash server --config-file=/root/.tiup/data/Sm3RCrS/tiflash-0/tiflash.toml
root 5170 69 0 09:14 pts/0 00:00:00 grep --color=auto tiup

[root@lhrtidb /]# tiup --help
bash: tiup: command not found
[root@lhrtidb /]# source /root/.bash_profile
[root@lhrtidb /]# more /root/.bash_profile
# .bash_profile

# Get the aliases and functions
if [ -f ~/.bashrc ]; then
        . ~/.bashrc
fi

# User specific environment and startup programs

PATH=$PATH:$HOME/bin

export PATH
export PATH=/root/.tiup/bin:$PATH

[root@lhrtidb /]# cd /root/.tiup/bin
[root@lhrtidb bin]# ll
total 17924
-rw-r--r-- 1 root root     7275 Oct 16 21:21 7b8e153f2e2d0928.root.json
-rw-r--r-- 1 root root     7275 Oct 16 21:21 root.json
-rwxr-xr-x 1 root root 18337792 Oct  9 11:41 tiup
[root@lhrtidb bin]# ll -h
total 18M
-rw-r--r-- 1 root root 7.2K Oct 16 21:21 7b8e153f2e2d0928.root.json
-rw-r--r-- 1 root root 7.2K Oct 16 21:21 root.json
-rwxr-xr-x 1 root root  18M Oct  9 11:41 tiup

[root@lhrtidb bin]# tiup -h
TiUP is a command-line component management tool that can help to download and install
TiDB platform components to the local system. You can run a specific version of a component via
"tiup <component>[:version]". If no version number is specified, the latest version installed
locally will be used. If the specified component does not have any version installed locally,
the latest stable version will be downloaded from the repository.

Usage:
  tiup [flags] <command> [args...]
  tiup [flags] <component> [args...]

Available Commands:
  install     Install a specific version of a component
  list        List the available TiDB components or versions
  uninstall   Uninstall components or versions of a component
  update      Update tiup components to the latest version
  status      List the status of instantiated components
  clean       Clean the data of instantiated components
  mirror      Manage a repository mirror for TiUP components
  telemetry   Controls things about telemetry
  env         Show the list of system environment variable that related to TiUP
  help        Help about any command or component
  completion  generate the autocompletion script for the specified shell

Components Manifest:
  use "tiup list" to fetch the latest components manifest

Flags:
  -B, --binary <component>[:version]   Print binary path of a specific version of a component
                                       <component>[:version] and the latest version installed
                                       will be selected if no version specified
      --binpath string                 Specify the binary path of component instance
      --help                           Help for this command
      --skip-version-check             Skip the strict version check, by default a version must be a valid SemVer string
  -T, --tag string                     [Deprecated] Specify a tag for component instance
  -v, --version                        Print the version of tiup

Component instances with the same "tag" will share a data directory ($TIUP_HOME/data/$tag):
  $ tiup --tag mycluster playground

Examples:
  $ tiup playground                    # Quick start
  $ tiup playground nightly            # Start a playground with the latest nightly version
  $ tiup install <component>[:version] # Install a component of specific version
  $ tiup update --all                  # Update all installed components to the latest version
  $ tiup update --nightly              # Update all installed components to the nightly version
  $ tiup update --self                 # Update the "tiup" to the latest version
  $ tiup list                          # Fetch the latest supported components list
  $ tiup status                        # Display all running/terminated instances
  $ tiup clean <name>                  # Clean the data of running/terminated instance (Kill process if it's running)
  $ tiup clean --all                   # Clean the data of all running/terminated instances

Use "tiup [command] --help" for more information about a command.

[root@lhrtidb /]# tiup playground --help
Bootstrap a TiDB cluster in your local host, the latest release version will be chosen
if you don't specified a version.

Examples:
  $ tiup playground nightly                         # Start a TiDB nightly version local cluster
  $ tiup playground v5.0.1 --db 3 --pd 3 --kv 3     # Start a local cluster with 10 nodes
  $ tiup playground nightly --monitor=false         # Start a local cluster and disable monitor system
  $ tiup playground --pd.config ~/config/pd.toml    # Start a local cluster with specified configuration file
  $ tiup playground --db.binpath /xx/tidb-server    # Start a local cluster with component binary path
  $ tiup playground --mode tikv-slim                # Start a local tikv only cluster (No TiDB or TiFlash Available)
  $ tiup playground --mode tikv-slim --kv 3 --pd 3  # Start a local tikv only cluster with 6 nodes

Usage:
  tiup playground [version] [flags]
  tiup [command]

Available Commands:
  completion  generate the autocompletion script for the specified shell
  display
  help        Help about any command
  scale-in
  scale-out

Flags:
      --db int                   TiDB instance number
      --db.Host host             Playground TiDB host. If not provided, TiDB will still use host flag as its host
      --db.Port int              Playground TiDB port. If not provided, TiDB will use 4000 as its port
      --db.binpath string        TiDB instance binary path
      --db.config string         TiDB instance configuration file
      --db.timeout int           TiDB max wait time in seconds for starting, 0 means no limit
      --drainer int              Drainer instance number
      --drainer.binpath string   Drainer instance binary path
      --drainer.config string    Drainer instance configuration file
  -h, --help                     help for tiup
      --host string              Playground cluster host
      --kv int                   TiKV instance number
      --kv.binpath string        TiKV instance binary path
      --kv.config string         TiKV instance configuration file
      --mode string              TiUP playground mode: 'tidb', 'tikv-slim' (default "tidb")
      --pd int                   PD instance number
      --pd.Host host             Playground PD host. If not provided, PD will still use host flag as its host
      --pd.binpath string        PD instance binary path
      --pd.config string         PD instance configuration file
      --pump int                 Pump instance number
      --pump.binpath string      Pump instance binary path
      --pump.config string       Pump instance configuration file
  -T, --tag string               Specify a tag for playground
      --ticdc int                TiCDC instance number
      --ticdc.binpath string     TiCDC instance binary path
      --ticdc.config string      TiCDC instance configuration file
      --tiflash int              TiFlash instance number
      --tiflash.binpath string   TiFlash instance binary path
      --tiflash.config string    TiFlash instance configuration file
      --tiflash.timeout int      TiFlash max wait time in seconds for starting, 0 means no limit
  -v, --version                  version for tiup
      --without-monitor          Don't start prometheus and grafana component

Use "tiup [command] --help" for more information about a command.

[root@lhrtidb bin]# tiup list
Available components:
Name            Owner      Description
----            -----      -----------
PCC             community  A tool used to capture plan changes among different versions of TiDB
bench           pingcap    Benchmark database with different workloads
br              pingcap    TiDB/TiKV cluster backup restore tool
cdc             pingcap    CDC is a change data capture tool for TiDB
client          pingcap    Client to connect playground
cluster         pingcap    Deploy a TiDB cluster for production
ctl             pingcap    TiDB controller suite
dm              pingcap    Data Migration Platform manager
dmctl           pingcap    dmctl component of Data Migration Platform
errdoc          pingcap    Document about TiDB errors
pd-recover      pingcap    PD Recover is a disaster recovery tool of PD, used to recover the PD cluster which cannot start or provide services normally
playground      pingcap    Bootstrap a local TiDB cluster for fun
tidb            pingcap    TiDB is an open source distributed HTAP database compatible with the MySQL protocol
tidb-lightning  pingcap    TiDB Lightning is a tool used for fast full import of large amounts of data into a TiDB cluster
tiup            pingcap    TiUP is a command-line component management tool that can help to download and install TiDB platform components to the local system

[root@lhrtidb /]# tiup status
Name     Component   PID  Status   Created Time               Directory                 Binary                                                     Args
----     ---------   ---  ------   ------------               ---------                 ------                                                     ----
Sm3RCrS  playground  385  RUNNING  2021-10-17T08:49:48+08:00  /root/.tiup/data/Sm3RCrS  /root/.tiup/components/playground/v1.6.0/tiup-playground  v5.2.1 --db 2 --pd 3 --kv 3 --monitor --host=0.0.0.0
[root@lhrtidb /]#
[root@lhrtidb /]# mysql -uroot -p -h172.17.0.22 -P4000
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 13
Server version: 5.7.25-TiDB-v5.2.1 TiDB Server (Apache License 2.0) Community Edition, MySQL 5.7 compatible

Copyright (c) 2000, 2020, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

MySQL [(none)]> select tidb_version() \G
*************************** 1. row ***************************
tidb_version(): Release Version: v5.2.1
Edition: Community
Git Commit Hash: cd8fb24c5f7ebd9d479ed228bb41848bd5e97445
Git Branch: heads/refs/tags/v5.2.1
UTC Build Time: 2021-09-08 02:32:56
GoVersion: go1.16.4
Race Enabled: false
TiKV Min Version: v3.0.0-60965b006877ca7234adaced7890d7b029ed1306
Check Table Before Drop: false
1 row in set (0.05 sec)

mysql> select STORE_ID,ADDRESS,STORE_STATE,STORE_STATE_NAME,CAPACITY,AVAILABLE,UPTIME from INFORMATION_SCHEMA.TIKV_STORE_STATUS;
+----------+-----------------+-------------+------------------+----------+-----------+------------------+
| STORE_ID | ADDRESS         | STORE_STATE | STORE_STATE_NAME | CAPACITY | AVAILABLE | UPTIME           |
+----------+-----------------+-------------+------------------+----------+-----------+------------------+
|        1 | 127.0.0.1:20160 |           0 | Up               | 825.7GiB | 122.7GiB  | 16m40.290861982s |
|        4 | 127.0.0.1:20161 |           0 | Up               | 825.7GiB | 122.7GiB  | 16m40.291075257s |
|        5 | 127.0.0.1:20162 |           0 | Up               | 825.7GiB | 122.7GiB  | 16m40.292138153s |
|      121 | 127.0.0.1:3930  |           0 | Up               | 825.7GiB | 825.7GiB  | 12m50.39510302s  |
+----------+-----------------+-------------+------------------+----------+-----------+------------------+
4 rows in set (0.00 sec)

MySQL [(none)]> SELECT VERSION();
+--------------------+
| VERSION()          |
+--------------------+
| 5.7.25-TiDB-v5.2.1 |
+--------------------+
1 row in set (0.05 sec)

MySQL [(none)]> select @@version_comment;
+--------------------------------------------------------------------------+
| @@version_comment                                                        |
+--------------------------------------------------------------------------+
| TiDB Server (Apache License 2.0) Community Edition, MySQL 5.7 compatible |
+--------------------------------------------------------------------------+
1 row in set (0.05 sec)

mysql> exit
Bye
```
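The `tiup playground --help` output above also lists `display`, `scale-in`, and `scale-out` subcommands, which act on the currently running playground. A sketch of using them from a second terminal (instance counts are illustrative):

```bash
# Show the instances of the running playground
tiup playground display

# Add one more TiDB instance to the running cluster
tiup playground scale-out --db 1

# Remove an instance by the pid shown in `display`
# (<pid> is a placeholder for a real process ID from that output)
tiup playground scale-in --pid <pid>

# When finished, stop the playground (Ctrl+C in its terminal, or kill the
# tiup-playground process) and wipe all instance data:
tiup clean --all
```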