
Minkowski Distance

Last Updated : 23 Jul, 2025

Minkowski Distance is a generalized metric that unifies various distance measures used in mathematics and machine learning. It provides a flexible way to compute distances between points in an n-dimensional space.

It is a powerful distance function that encompasses several well-known distance metrics as special cases: different values of p yield Manhattan Distance (p = 1), Euclidean Distance (p = 2), and Chebyshev Distance (p → ∞).

The Minkowski Distance between two points A = (A1, A2, …, An) and B = (B1, B2, …, Bn) in an n-dimensional space is given by:

D(A, B) = \left( \sum_{i=1}^{n} |A_i - B_i|^p \right)^{\frac{1}{p}}

where p is a parameter that determines the nature of the distance metric.

Generalization of Other Distance Metrics

The Minkowski Distance can represent different types of distances based on the value of p:

1. Manhattan Distance (p = 1)

D(A, B) = \sum_{i=1}^{n} |A_i - B_i|

Also called Taxicab Distance, it represents movement along grid-based paths, such as city blocks where diagonal movement is restricted.
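In NumPy this is a one-line reduction; a minimal sketch using two illustrative points:

```python
import numpy as np

A = np.array([2, 5, 8])
B = np.array([3, 1, 6])

# Manhattan Distance: sum of absolute coordinate differences
manhattan = np.sum(np.abs(A - B))
print(manhattan)  # → 7
```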

2. Euclidean Distance (p = 2)

D(A, B) = \sqrt{\sum_{i=1}^{n} (A_i - B_i)^2}

This is the straight-line distance between two points, making it the most commonly used metric in geometry, physics, and machine learning.
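NumPy's `np.linalg.norm` computes this directly; a small sketch on the same illustrative points:

```python
import numpy as np

A = np.array([2, 5, 8])
B = np.array([3, 1, 6])

# Euclidean Distance: the 2-norm of the difference vector
euclidean = np.linalg.norm(A - B)
print(euclidean)  # sqrt(21), about 4.58
```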

3. Chebyshev Distance (p→∞)

D(A, B) = \max_{i} |A_i - B_i|

This measures the maximum absolute coordinate difference, often used in chess for king movement and in warehouse logistics for max-step constraints.
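A sketch of the same computation in NumPy, taking the maximum of the absolute differences:

```python
import numpy as np

A = np.array([2, 5, 8])
B = np.array([3, 1, 6])

# Chebyshev Distance: largest absolute coordinate difference
chebyshev = np.max(np.abs(A - B))
print(chebyshev)  # → 4
```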

[Figure: Distance Metrics]

Numerical Example

Consider two points in 3D space:

A = (2, 5, 8), B = (3, 1, 6)

For Manhattan Distance (p = 1)

D(A, B) = |2 - 3| + |5 - 1| + |8 - 6| = 1 + 4 + 2 = 7

For Euclidean Distance (p = 2)

D(A, B) = \sqrt{(2 - 3)^2 + (5 - 1)^2 + (8 - 6)^2}

D(A, B) = \sqrt{1 + 16 + 4} = \sqrt{21} \approx 4.58

For Chebyshev Distance (p→∞)

D(A, B) = \max(|2 - 3|, |5 - 1|, |8 - 6|) = \max(1, 4, 2) = 4

Mathematical Properties

1. Non-negativity

D(A, B) \geq 0

Distance is always non-negative, which makes it a valid measure of separation between points.

2. Identity of Indiscernibles

D(A, B) = 0 \iff A = B

The distance between two points is zero if and only if they are identical, ensuring that distinct points always have a nonzero distance.

3. Symmetry

D(A, B) = D(B, A)

The distance remains unchanged when the order of points is swapped, making the metric independent of direction.

4. Triangle Inequality (Only for p≥1)

D(A, C) \leq D(A, B) + D(B, C)

For p ≥ 1, the direct path between two points is never longer than a detour through a third point, the defining property of a metric space.
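For 0 < p < 1 the triangle inequality can fail, which is why Minkowski Distance is a true metric only for p ≥ 1. A small sketch with hypothetical points demonstrates the violation at p = 0.5:

```python
import numpy as np

def minkowski(u, v, p):
    """Minkowski Distance between two points for a given p."""
    return np.sum(np.abs(np.asarray(u, dtype=float) - np.asarray(v, dtype=float)) ** p) ** (1 / p)

# Points chosen so the detour through B beats the direct path at p = 0.5
A, B, C = [0, 0], [1, 0], [1, 1]
p = 0.5
direct = minkowski(A, C, p)
detour = minkowski(A, B, p) + minkowski(B, C, p)
print(direct)  # 4.0
print(detour)  # 2.0 -- the "detour" is shorter, so the inequality fails
```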

5. Generalization

\lim_{p \to \infty} D(A, B) = \max_{i} |A_i - B_i|

As p increases, the Minkowski Distance approaches the Chebyshev Distance, meaning only the largest coordinate difference dominates the overall distance measure.
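This convergence can be observed numerically; a sketch using the sample points from the worked example (float arrays avoid integer overflow at large p):

```python
import numpy as np

A = np.array([2, 5, 8], dtype=float)
B = np.array([3, 1, 6], dtype=float)

# As p grows, the largest difference (4) dominates the sum
for p in [1, 2, 10, 100]:
    d = np.sum(np.abs(A - B) ** p) ** (1 / p)
    print(f"p={p}: {d:.4f}")
```

The printed values fall from 7.0 toward the Chebyshev value of 4 as p increases.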

Implementation in Python

Below is the Python implementation of Minkowski Distance, including visualization:

Python
import numpy as np
import matplotlib.pyplot as plt

def minkowski_distance(A, B, p):
    """Compute Minkowski Distance between two points A and B for given p."""
    return np.power(np.sum(np.abs(np.array(A) - np.array(B)) ** p), 1 / p)

# Sample points
A = np.array([2, 5, 8])
B = np.array([3, 1, 6])

# Compute distances for different values of p
p_values = [1, 2, 10]
distances = [minkowski_distance(A, B, p) for p in p_values]

# Print results
for p, d in zip(p_values, distances):
    print(f"Minkowski Distance (p={p}): {d:.4f}")

# Visualization
plt.figure(figsize=(6, 4))
plt.plot(p_values, distances, marker='o', linestyle='-', color='b')
plt.xlabel("p (Minkowski Parameter)")
plt.ylabel("Distance")
plt.title("Minkowski Distance for Different p Values")
plt.grid(True)
plt.show()

Output:

Minkowski Distance (p=1): 7.0000
Minkowski Distance (p=2): 4.5826
Minkowski Distance (p=10): 4.0004

[Figure: Minkowski Distance for different values of p]

The plot shows how Minkowski Distance varies for different p values.

Real-Life Applications

1. Machine Learning & Clustering

Used in K-Nearest Neighbors (KNN) and K-Means Clustering to measure the similarity between data points.
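As an illustration, a minimal 1-nearest-neighbor lookup with a configurable p; the points, labels, and helper names here are made up for the sketch (scikit-learn's KNeighborsClassifier exposes the same idea through its p parameter):

```python
import numpy as np

def minkowski(u, v, p):
    """Minkowski Distance between two points for a given p."""
    return np.sum(np.abs(u - v) ** p) ** (1 / p)

def nearest_neighbor_label(query, points, labels, p=2):
    """Return the label of the training point closest to `query`."""
    dists = [minkowski(query, x, p) for x in points]
    return labels[int(np.argmin(dists))]

# Toy training data (illustrative)
X = np.array([[1.0, 1.0], [2.0, 2.0], [8.0, 8.0]])
y = ["small", "small", "large"]

print(nearest_neighbor_label(np.array([1.5, 1.2]), X, y, p=1))  # small
print(nearest_neighbor_label(np.array([7.0, 9.0]), X, y, p=2))  # large
```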

2. Image Processing

Used in measuring similarity between images, especially in image retrieval systems.

3. Finance & Risk Analysis

Employed in portfolio analysis to compute risk distances between financial assets.

4. Robotics & Path Planning

Manhattan and Euclidean distances (special cases) are widely used in robot motion planning and autonomous navigation.
