Fill database tables with a large amount of test data

Problem description

I need to load a table with a large amount of test data. This is to be used for testing performance and scaling.

How can I easily create 100,000 rows of random/junk data for my database table?

Recommended answer

You could also use a stored procedure. Consider the following table as an example:

CREATE TABLE your_table (id int NOT NULL PRIMARY KEY AUTO_INCREMENT, val int);

Then you could add a stored procedure like this:

DELIMITER $$
CREATE PROCEDURE prepare_data()
BEGIN
  DECLARE i INT DEFAULT 1;

  -- Loop from 1 to 100,000 so the table ends up with exactly 100k rows.
  WHILE i <= 100000 DO
    INSERT INTO your_table (val) VALUES (i);
    SET i = i + 1;
  END WHILE;
END$$
DELIMITER ;

When you call it, you'll have 100k records:

CALL prepare_data();
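
The question asks for random/junk values rather than a neat sequence, and 100,000 single-row inserts are slow when each one autocommits. A minimal variation of the same procedure (the name prepare_random_data and the RAND() range are my own choices, not part of the original answer) fills val with random numbers and commits once at the end:

DELIMITER $$
CREATE PROCEDURE prepare_random_data()
BEGIN
  DECLARE i INT DEFAULT 1;

  START TRANSACTION;  -- one commit at the end instead of 100,000 autocommits
  WHILE i <= 100000 DO
    INSERT INTO your_table (val) VALUES (FLOOR(RAND() * 1000000));  -- junk value
    SET i = i + 1;
  END WHILE;
  COMMIT;
END$$
DELIMITER ;

CALL prepare_random_data();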

Another recommended answer

For multiple-row cloning (data duplication), you could use:

DELIMITER $$
CREATE PROCEDURE insert_test_data()
BEGIN
  DECLARE i INT DEFAULT 1;

  -- Copy the seed row (id = 1) once per iteration, roughly 100,000 times in total.
  WHILE i < 100000 DO
    INSERT INTO `table` (`user_id`, `page_id`, `name`, `description`, `created`)
    SELECT `user_id`, `page_id`, `name`, `description`, `created`
    FROM `table`
    WHERE id = 1;
    SET i = i + 1;
  END WHILE;
END$$
DELIMITER ;
CALL insert_test_data();
DROP PROCEDURE insert_test_data;
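
Note that this assumes `table` already contains a seed row with id = 1; every loop iteration copies that single row, so you end up with roughly 100,000 duplicates of it. A seed row might look like the following (the column values are invented purely for illustration):

INSERT INTO `table` (`id`, `user_id`, `page_id`, `name`, `description`, `created`)
VALUES (1, 42, 7, 'test name', 'junk description for load testing', NOW());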

Another recommended answer

Here is a solution using pure math and SQL: seed the table with a few rows, then keep doubling it. Each INSERT ... SELECT copies every existing row with x shifted by the current row count, so the new keys never collide with the old ones:

create table t1(x int primary key auto_increment);
insert into t1 () values (),(),();

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 1265 rows affected (0.01 sec)
Records: 1265  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 2530 rows affected (0.02 sec)
Records: 2530  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 5060 rows affected (0.03 sec)
Records: 5060  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 10120 rows affected (0.05 sec)
Records: 10120  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 20240 rows affected (0.12 sec)
Records: 20240  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 40480 rows affected (0.17 sec)
Records: 40480  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 80960 rows affected (0.31 sec)
Records: 80960  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 161920 rows affected (0.57 sec)
Records: 161920  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 323840 rows affected (1.13 sec)
Records: 323840  Duplicates: 0  Warnings: 0

mysql> insert into t1 (x) select x + (select count(*) from t1) from t1;
Query OK, 647680 rows affected (2.33 sec)
Records: 647680  Duplicates: 0  Warnings: 0
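
If you would rather not re-run that statement by hand until the table is big enough, the same doubling trick can be wrapped in a stored procedure. This is only a sketch; the name fill_t1 and the row-count target are my own additions, not part of the original answer:

DELIMITER $$
CREATE PROCEDURE fill_t1(IN target INT)
BEGIN
  DECLARE n INT;
  SELECT COUNT(*) INTO n FROM t1;  -- assumes t1 was already seeded with a few rows
  WHILE n < target DO
    -- Each pass doubles the table: copy every row with x shifted by the current count.
    INSERT INTO t1 (x) SELECT x + n FROM t1;
    SELECT COUNT(*) INTO n FROM t1;
  END WHILE;
END$$
DELIMITER ;

CALL fill_t1(100000);
DROP PROCEDURE fill_t1;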