How to sort elements by values with Spark in Java

Spark's sortByKey only sorts pairs by key, not by value. Sorting by values can still be achieved by mapping each pair to a new pair with the value as the key and the key as the value, sorting by key, and then rendering the output.
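The swap-then-sort idea is independent of Spark, so it can be illustrated in plain Java first. The sketch below uses a hypothetical word-count map (the class and method names are made up for this example); each (word, count) entry is swapped into (count, word) and sorted on the new key, descending:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class SwapSortSketch {

	// Swap (key, value) into (value, key), then sort by the new key, descending.
	// This mirrors the mapToPair + sortByKey(false) pattern used with Spark.
	static List<SimpleEntry<Integer, String>> sortByValueDesc(Map<String, Integer> counts) {
		return counts.entrySet().stream()
				.map(e -> new SimpleEntry<>(e.getValue(), e.getKey()))
				.sorted((a, b) -> Integer.compare(b.getKey(), a.getKey()))
				.collect(Collectors.toList());
	}

	public static void main(String[] args) {
		// Hypothetical word-count data: key = word, value = count.
		Map<String, Integer> counts = Map.of("alpha", 3, "beta", 7, "gamma", 1);
		for (SimpleEntry<Integer, String> e : sortByValueDesc(counts)) {
			System.out.println(e.getValue() + "\t" + e.getKey());
		}
	}
}
```

With Spark, the swap is done with mapToPair and the sort with sortByKey, but the ordering logic is the same.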


// dataSet1 is a JavaRDD<String> of raw input lines; getIp extracts the key.
long start = System.nanoTime();
JavaPairRDD<String, Stats0> extracted = dataSet1.mapToPair(s -> new Tuple2<>(getIp(s), new Stats0(1)));
JavaPairRDD<String, Stats0> baseKeyPair = extracted.reduceByKey(Stats0::merge);

// Map for sorting
JavaPairRDD<Integer, Tuple2<String, Stats0>> sortingRDD = baseKeyPair
		.mapToPair(t -> new Tuple2<>(t._2().getCount(), t));

// Sort by keys
sortingRDD = sortingRDD.sortByKey(false);

// Collect to display the output
List<Tuple2<Integer, Tuple2<String, Stats0>>> output = sortingRDD.collect();

long end = System.nanoTime();
for (Tuple2<Integer, Tuple2<String, Stats0>> t : output) {
	System.out.println(t._2()._1() + "\t" + t._1());
}

System.out.println("Processed in : " + (end - start) / 1_000_000 + " ms");

This Java example uses the class Stats0, a serializable wrapper around an integer count. The sort can be generalized with a custom comparator, which allows any object type to be used as the key, as long as it is serializable.
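The comparator has to implement Serializable so Spark can ship it to the executors; JavaPairRDD.sortByKey has an overload that accepts a comparator for the key type. Below is a minimal sketch of such a comparator (Stats0Comparator is a hypothetical name, and Stats0 is reduced here to just the count field so the sketch compiles on its own); it is demonstrated with a plain list sort, since the same instance could be passed to sortByKey:

```java
import java.io.Serializable;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class Stats0Comparator implements Comparator<Stats0Comparator.Stats0>, Serializable {

	// Minimal copy of Stats0 so this sketch is self-contained.
	public static class Stats0 implements Serializable {
		private final int count;
		public Stats0(int count) { this.count = count; }
		public int getCount() { return count; }
	}

	@Override
	public int compare(Stats0 a, Stats0 b) {
		// Descending by count, so the largest values come first.
		return Integer.compare(b.getCount(), a.getCount());
	}

	public static void main(String[] args) {
		List<Stats0> stats = Arrays.asList(new Stats0(2), new Stats0(9), new Stats0(5));
		// The same serializable instance could be passed to JavaPairRDD.sortByKey.
		stats.sort(new Stats0Comparator());
		for (Stats0 s : stats) System.out.println(s.getCount());
	}
}
```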

public static class Stats0 implements Serializable {

	private final int count;

	public Stats0(int count) {
		this.count = count;
	}

	// Associative merge, as required by reduceByKey.
	public Stats0 merge(Stats0 other) {
		return new Stats0(count + other.count);
	}

	public int getCount() {
		return count;
	}

	@Override
	public String toString() {
		return String.format("n=%d", count);
	}
}
