This is the table's structure. It has five fields.
Here is the command I use to add new data.
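The original command is not shown here. A minimal sketch of what the insert might look like, assuming a table named `ratings` (the table name and any columns beyond MovieID, Time, Count, and Rating are assumptions):

```sql
-- Hypothetical schema; the actual column list is not shown in the post.
INSERT INTO ratings (MovieID, Time, Count, Rating)
VALUES (1, '2023-01-01 00:00:00', 1, 4);

-- For hundreds of thousands of rows, multi-row INSERTs are much faster
-- than inserting one row per statement:
INSERT INTO ratings (MovieID, Time, Count, Rating)
VALUES (1, '2023-01-01 00:00:00', 1, 4),
       (2, '2023-01-01 00:00:01', 3, 5),
       (3, '2023-01-01 00:00:02', 2, 3);
```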
My task is to load a large amount of data into this table: almost 700,000 rows.
At first, I tried to optimize each field, for example by choosing smaller field types: an INT is cheaper than a CHAR for numeric data. I also used constants to define each field's length. However, this kind of optimization did not have much influence on efficiency. Next, I tried creating indexes on the table; different indexes on different fields gave noticeably different results.
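The field-type optimization could look like this. This is only a sketch of the idea, not the original DDL; the table name and exact types are assumptions:

```sql
-- Prefer the smallest type that fits the value range (a sketch, not the
-- original table definition):
CREATE TABLE ratings (
    MovieID INT UNSIGNED NOT NULL,      -- numeric ID: INT beats CHAR
    Time    DATETIME     NOT NULL,
    Count   INT UNSIGNED NOT NULL,
    Rating  TINYINT UNSIGNED NOT NULL   -- a small rating fits in 1 byte
);
```

Smaller columns mean smaller rows and smaller index entries, which is why the effect exists at all, but as noted above it is minor compared with index choice.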
I ran some experiments:
| Solution | Indexes | Time (s) |
|---|---|---|
| 1 | idx1 (MovieID, Time, Count); idx2 (MovieID, Time, Rating) | 846.411153 |
| 2 | idx1 (Time, MovieID, Count); idx2 (Time, MovieID, Rating) | 855.032292 |
| 3 | idx1 (Count); idx2 (Rating); idx3 (MovieID, Time) | 840.703046 |
| 4 | idx1 (Count); idx2 (MovieID, Time) | 826.489348 |
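The fastest combination (solution 4) could be created like this, assuming the table is named `ratings`:

```sql
-- Solution 4: fewest and narrowest secondary indexes
CREATE INDEX idx1 ON ratings (Count);
CREATE INDEX idx2 ON ratings (MovieID, Time);
```

This result makes sense for bulk loading: every secondary index must be updated for each inserted row, so fewer and narrower indexes mean less maintenance work per insert.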
- focus on the queries you actually run
- focus on sorting
- don't create unused indexes
- understand composite indexes
- Indexes are not just for primary keys or unique keys. If there are any columns in your table that you will search on, you should almost always index them.
- The same rule applies to a partial search like `last_name LIKE 'A%'`. When searching from the beginning of the string, MySQL is able to use the index on that column.
- Some searches can't use regular indexes. For instance, when searching for a word, as in `WHERE post_content LIKE '%apple%'`, you will not see a benefit from a normal index, because the leading wildcard prevents a prefix match.
- If your application contains many JOIN queries, you need to make sure that the columns you join on are indexed in both tables.
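The last three points can be sketched as follows. The table and column names here (`users`, `posts`, `movies`) are assumptions for illustration:

```sql
-- Prefix search: the index on last_name can be used for 'A%'
CREATE INDEX idx_last_name ON users (last_name);
EXPLAIN SELECT * FROM users WHERE last_name LIKE 'A%';   -- range scan on index
EXPLAIN SELECT * FROM users WHERE last_name LIKE '%son'; -- full scan, index unusable

-- JOIN queries: index the join column on both sides
CREATE INDEX idx_ratings_movie ON ratings (MovieID);
CREATE INDEX idx_movies_id     ON movies  (MovieID);
SELECT m.MovieID, r.Rating
FROM movies  m
JOIN ratings r ON r.MovieID = m.MovieID;
```

Running `EXPLAIN` on your own queries is the simplest way to confirm whether an index is actually being used.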