Checklist

This document outlines optimization strategies for database tables and queries, emphasizing the importance of proper indexing and character-set consistency. It provides specific checkpoints for improving query performance, such as using EXPLAIN statements to analyze query cost and avoiding costly subqueries, and it highlights best practices for index creation and the limitations of non-clustered indexes.


Table Optimisation

1. Check Point: Are indexes created for the table columns used in WHERE clauses or in joins?
   Mitigation / Suggestion: Make sure the columns used in the WHERE clause while retrieving are part of the primary key or are indexed.

2. Check Point: Do the tables in the query use different character sets? Index use will fail in this case.
   Mitigation / Suggestion: Use the same charset for all the tables.
   Action / Comments: Verify that all the tables within the database have the same charset.
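A minimal sketch of both checkpoints, assuming hypothetical `orders` and `customers` tables in a database named `mydb` (all names here are illustrative, not from the original):

```sql
-- Checkpoint 1: index the column used in WHERE clauses and joins.
CREATE INDEX idx_orders_customer_id ON orders (customer_id);

-- Checkpoint 2: compare character sets / collations across the tables.
SELECT table_name, table_collation
FROM information_schema.tables
WHERE table_schema = 'mydb';

-- If a table differs, align it with the rest, e.g.:
ALTER TABLE orders CONVERT TO CHARACTER SET utf8mb4;
```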

Query Optimisation

1. Check Point: Execute explain format=json <query>.
   Comments: This gives the query cost, an important factor when deciding whether a query is slow. A query cost in the 0-1000 range implies the query is quite optimal; 1000-100000 indicates a medium query; above 100000 the query needs to be optimized.
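A sketch of this checkpoint against a hypothetical `orders` table (the table and filter are assumptions for illustration):

```sql
-- The cost appears in the JSON output under "cost_info".
EXPLAIN FORMAT=JSON
SELECT id, total
FROM orders
WHERE customer_id = 42;
-- Look for: "cost_info": { "query_cost": "..." }
-- and compare it against the 0-1000 / 1000-100000 / >100000 bands above.
```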

2. Check Point: Run the query with an EXPLAIN statement and:
   1) Check if the filtered column is 100% or ~100%.
   2) Check if possible_keys and the key used are matching.
   3) Check if a complete table scan is happening.
   4) Check if the type is ALL for any row in the EXPLAIN output.
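The four checks above can be sketched on a hypothetical two-table query (names are illustrative):

```sql
EXPLAIN
SELECT o.id
FROM orders o
JOIN customers c ON c.id = o.customer_id
WHERE c.country = 'IN';
-- In each output row, inspect:
--   filtered       -> should be at or near 100
--   possible_keys  -> should match the key actually chosen in "key"
--   type           -> 'ALL' means a full table scan on that table
```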

3. Check Point: Check that the frequently accessed columns come first (highest priority order/cardinality) in the primary key, matching the left-to-right order of the WHERE clause.
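A small illustration of the left-to-right ordering point, using an assumed composite index on a hypothetical `orders` table:

```sql
-- Put the most frequently filtered, highest-cardinality column first.
CREATE INDEX idx_orders_cust_status ON orders (customer_id, status);

-- Can use the index (leftmost column is in the WHERE clause):
SELECT * FROM orders WHERE customer_id = 42 AND status = 'OPEN';

-- Cannot use the index efficiently (leftmost column is missing):
SELECT * FROM orders WHERE status = 'OPEN';
```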

4. Check Point: Check if there are subqueries.
   Mitigation / Suggestion: Try to replace subqueries with joins, as subqueries are costlier operations.
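One possible rewrite of a subquery into a join, under the same hypothetical schema:

```sql
-- Subquery form (often costlier):
SELECT name FROM customers
WHERE id IN (SELECT customer_id FROM orders WHERE total > 1000);

-- Equivalent join form (usually cheaper):
SELECT DISTINCT c.name
FROM customers c
JOIN orders o ON o.customer_id = c.id
WHERE o.total > 1000;
```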

5. Check Point: Check if duplicate aggregate functions are used.
   Mitigation / Suggestion: Avoid duplicate aggregate functions; it is recommended to do all the aggregation in the subqueries and avoid repeating it in the outer layers.
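A sketch of keeping aggregation in the inner layer, with hypothetical names:

```sql
-- Compute the aggregate once in the subquery and reuse it outside,
-- instead of repeating COUNT(*) in the outer SELECT and HAVING.
SELECT t.customer_id, t.order_count
FROM (
    SELECT customer_id, COUNT(*) AS order_count
    FROM orders
    GROUP BY customer_id
) AS t
WHERE t.order_count > 10;
```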

6. Check Point: Check if joins are used in the query.
   Comments: Recommended order of joins: inner join > left outer join > right outer join > cross join, where '>' indicates better performance.

Points to be Noted:

1. Queries that retrieve distinct columns cannot have indexes created for them, so they are bound to perform slowly.
2. For columns with the JSON data type, the only possible index is a hash index (which is not usually allowed).
3. key_len indicates the number of bytes consumed by the index. Try not to create too many non-clustered indexes, as the MySQL engine adds up their length along with the primary index and then executes the query.
4. The maximum number of non-clustered indexes that can be created on a table is 64.
5. Try to avoid aggregate functions as much as possible.
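As an aside on point 2: MySQL 8.0.13 and later support functional indexes, which offer a workaround for indexing a value stored inside a JSON column by indexing a cast expression over it. This is an addition beyond the original notes, and the table, column, and path names are hypothetical:

```sql
-- Index a scalar extracted from a JSON column via a functional index.
-- COLLATE utf8mb4_bin matches the binary collation of ->> output.
CREATE INDEX idx_events_meta_name
    ON events ((CAST(meta->>'$.name' AS CHAR(64)) COLLATE utf8mb4_bin));
```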
